Mar 19 11:50:54.762144 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 19 11:50:55.351518 master-0 kubenswrapper[4029]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:50:55.351518 master-0 kubenswrapper[4029]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 19 11:50:55.351518 master-0 kubenswrapper[4029]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:50:55.353706 master-0 kubenswrapper[4029]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:50:55.353706 master-0 kubenswrapper[4029]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 19 11:50:55.353706 master-0 kubenswrapper[4029]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:50:55.355574 master-0 kubenswrapper[4029]: I0319 11:50:55.355400 4029 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 19 11:50:55.361472 master-0 kubenswrapper[4029]: W0319 11:50:55.361441 4029 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:50:55.361472 master-0 kubenswrapper[4029]: W0319 11:50:55.361460 4029 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:50:55.361472 master-0 kubenswrapper[4029]: W0319 11:50:55.361465 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:50:55.361472 master-0 kubenswrapper[4029]: W0319 11:50:55.361470 4029 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:50:55.361472 master-0 kubenswrapper[4029]: W0319 11:50:55.361476 4029 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361481 4029 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361487 4029 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361491 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361496 4029 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361501 4029 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361505 4029 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361509 4029 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361517 4029 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361522 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361526 4029 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361530 4029 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361534 4029 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361538 4029 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361543 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361548 4029 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361552 4029 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361556 4029 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361560 4029 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:50:55.361601 master-0 kubenswrapper[4029]: W0319 11:50:55.361564 4029 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361571 4029 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361577 4029 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361582 4029 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361587 4029 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361591 4029 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361596 4029 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361602 4029 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361606 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361610 4029 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361615 4029 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361619 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361624 4029 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361628 4029 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361632 4029 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361637 4029 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361641 4029 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361646 4029 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361653 4029 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:50:55.362000 master-0 kubenswrapper[4029]: W0319 11:50:55.361659 4029 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361663 4029 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361667 4029 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361672 4029 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361677 4029 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361682 4029 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361687 4029 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361691 4029 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361698 4029 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361703 4029 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361708 4029 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361714 4029 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361718 4029 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361738 4029 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361743 4029 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361748 4029 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361753 4029 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361758 4029 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361762 4029 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361766 4029 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:50:55.362378 master-0 kubenswrapper[4029]: W0319 11:50:55.361772 4029 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361776 4029 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361780 4029 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361785 4029 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361790 4029 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361795 4029 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361801 4029 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361807 4029 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361812 4029 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: W0319 11:50:55.361818 4029 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361923 4029 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361935 4029 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361949 4029 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361956 4029 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361963 4029 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361970 4029 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361977 4029 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361984 4029 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361989 4029 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.361995 4029 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.362001 4029 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.362007 4029 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 11:50:55.362843 master-0 kubenswrapper[4029]: I0319 11:50:55.362013 4029 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362019 4029 flags.go:64] FLAG: --cgroup-root=""
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362024 4029 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362031 4029 flags.go:64] FLAG: --client-ca-file=""
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362036 4029 flags.go:64] FLAG: --cloud-config=""
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362042 4029 flags.go:64] FLAG: --cloud-provider=""
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362047 4029 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362054 4029 flags.go:64] FLAG: --cluster-domain=""
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362058 4029 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362064 4029 flags.go:64] FLAG: --config-dir=""
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362070 4029 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362076 4029 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362084 4029 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362089 4029 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362094 4029 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362100 4029 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362105 4029 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362110 4029 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362115 4029 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362122 4029 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362126 4029 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362133 4029 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362138 4029 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362144 4029 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362150 4029 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362154 4029 flags.go:64] FLAG: --enable-server="true"
Mar 19 11:50:55.363357 master-0 kubenswrapper[4029]: I0319 11:50:55.362159 4029 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362167 4029 flags.go:64] FLAG: --event-burst="100"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362172 4029 flags.go:64] FLAG: --event-qps="50"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362177 4029 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362183 4029 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362188 4029 flags.go:64] FLAG: --eviction-hard=""
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362195 4029 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362200 4029 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362205 4029 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362211 4029 flags.go:64] FLAG: --eviction-soft=""
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362216 4029 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362222 4029 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362227 4029 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362233 4029 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362238 4029 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362243 4029 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362248 4029 flags.go:64] FLAG: --feature-gates=""
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362255 4029 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362260 4029 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362266 4029 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362272 4029 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362277 4029 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362283 4029 flags.go:64] FLAG: --help="false"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362287 4029 flags.go:64] FLAG: --hostname-override=""
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362292 4029 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362298 4029 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 11:50:55.363934 master-0 kubenswrapper[4029]: I0319 11:50:55.362303 4029 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362307 4029 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362312 4029 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362318 4029 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362323 4029 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362328 4029 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362333 4029 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362338 4029 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362343 4029 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362348 4029 flags.go:64] FLAG: --kube-reserved=""
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362353 4029 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362357 4029 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362362 4029 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362367 4029 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362372 4029 flags.go:64] FLAG: --lock-file=""
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362377 4029 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362383 4029 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362389 4029 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362398 4029 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362404 4029 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362409 4029 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362413 4029 flags.go:64] FLAG: --logging-format="text"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362419 4029 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362424 4029 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362429 4029 flags.go:64] FLAG: --manifest-url=""
Mar 19 11:50:55.364443 master-0 kubenswrapper[4029]: I0319 11:50:55.362434 4029 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362441 4029 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362446 4029 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362453 4029 flags.go:64] FLAG: --max-pods="110"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362457 4029 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362463 4029 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362468 4029 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362473 4029 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362478 4029 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362482 4029 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362487 4029 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362500 4029 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362505 4029 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362510 4029 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362516 4029 flags.go:64] FLAG: --pod-cidr=""
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362521 4029 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362530 4029 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362535 4029 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362540 4029 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362545 4029 flags.go:64] FLAG: --port="10250"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362550 4029 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362555 4029 flags.go:64] FLAG: --provider-id=""
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362560 4029 flags.go:64] FLAG: --qos-reserved=""
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362565 4029 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 11:50:55.364995 master-0 kubenswrapper[4029]: I0319 11:50:55.362572 4029 flags.go:64] FLAG: --register-node="true"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362578 4029 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362583 4029 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362593 4029 flags.go:64] FLAG: --registry-burst="10"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362598 4029 flags.go:64] FLAG: --registry-qps="5"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362605 4029 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362610 4029 flags.go:64] FLAG: --reserved-memory=""
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362616 4029 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362622 4029 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362627 4029 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362632 4029 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362638 4029 flags.go:64] FLAG: --runonce="false"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362642 4029 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362648 4029 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362653 4029 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362658 4029 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362663 4029 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362668 4029 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362674 4029 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362680 4029 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362686 4029 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362691 4029 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362697 4029 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362702 4029 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362707 4029 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 11:50:55.365465 master-0 kubenswrapper[4029]: I0319 11:50:55.362712 4029 flags.go:64] FLAG: --system-cgroups=""
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362720 4029 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362751 4029 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362757 4029 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362762 4029 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362768 4029 flags.go:64] FLAG: --tls-min-version=""
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362773 4029 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362779 4029 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362784 4029 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362789 4029 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362794 4029 flags.go:64] FLAG: --v="2"
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362802 4029 flags.go:64] FLAG: --version="false"
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362809 4029 flags.go:64] FLAG: --vmodule=""
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362818 4029 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: I0319 11:50:55.362824 4029 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: W0319 11:50:55.362953 4029 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: W0319 11:50:55.362959 4029 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: W0319 11:50:55.362966 4029 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: W0319 11:50:55.362973 4029 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: W0319 11:50:55.362978 4029 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: W0319 11:50:55.362983 4029 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: W0319 11:50:55.362987 4029 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:50:55.365997 master-0 kubenswrapper[4029]: W0319 11:50:55.362992 4029 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.362998 4029 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363002 4029 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363007 4029 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363011 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363016 4029 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363020 4029 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363025 4029 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363030 4029 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363034 4029 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363038 4029 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363043 4029 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363047 4029 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363052 4029 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363056 4029 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363061 4029 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363066 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363070 4029 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363075 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 11:50:55.366432 master-0 kubenswrapper[4029]: W0319 11:50:55.363079 4029 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363084 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363088 4029 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 
11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363092 4029 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363097 4029 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363101 4029 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363106 4029 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363110 4029 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363115 4029 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363120 4029 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363124 4029 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363128 4029 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363133 4029 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363137 4029 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363142 4029 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363146 4029 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363156 4029 
feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363161 4029 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363165 4029 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363170 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:50:55.366832 master-0 kubenswrapper[4029]: W0319 11:50:55.363174 4029 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363179 4029 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363183 4029 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363187 4029 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363192 4029 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363196 4029 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363200 4029 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363204 4029 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363208 4029 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363213 4029 feature_gate.go:330] unrecognized feature 
gate: AWSClusterHostedDNS Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363217 4029 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363221 4029 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363226 4029 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363231 4029 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363235 4029 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363239 4029 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363243 4029 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363248 4029 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363252 4029 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363257 4029 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 11:50:55.367272 master-0 kubenswrapper[4029]: W0319 11:50:55.363261 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:50:55.367756 master-0 kubenswrapper[4029]: W0319 11:50:55.363269 4029 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 11:50:55.367756 master-0 kubenswrapper[4029]: W0319 11:50:55.363275 4029 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 11:50:55.367756 master-0 kubenswrapper[4029]: W0319 11:50:55.363281 4029 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:50:55.367756 master-0 kubenswrapper[4029]: W0319 11:50:55.363288 4029 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 11:50:55.367756 master-0 kubenswrapper[4029]: W0319 11:50:55.363295 4029 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:50:55.367756 master-0 kubenswrapper[4029]: I0319 11:50:55.363309 4029 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 11:50:55.375496 master-0 kubenswrapper[4029]: I0319 11:50:55.374903 4029 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 19 11:50:55.375547 master-0 kubenswrapper[4029]: I0319 11:50:55.375500 4029 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375700 4029 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375759 4029 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375770 4029 feature_gate.go:330] unrecognized 
feature gate: AdminNetworkPolicy Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375785 4029 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375800 4029 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375810 4029 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375819 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375830 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375840 4029 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 11:50:55.375839 master-0 kubenswrapper[4029]: W0319 11:50:55.375852 4029 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375865 4029 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375877 4029 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375887 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375898 4029 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375907 4029 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375916 4029 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375926 4029 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375935 4029 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375943 4029 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375952 4029 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375960 4029 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375969 4029 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375977 4029 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.375986 4029 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:50:55.376062 master-0 
kubenswrapper[4029]: W0319 11:50:55.375995 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.376004 4029 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.376012 4029 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.376020 4029 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.376028 4029 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:50:55.376062 master-0 kubenswrapper[4029]: W0319 11:50:55.376037 4029 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376046 4029 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376056 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376064 4029 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376074 4029 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376086 4029 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376095 4029 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376104 4029 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 
11:50:55.376114 4029 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376122 4029 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376132 4029 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376140 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376149 4029 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376158 4029 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376166 4029 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376175 4029 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376183 4029 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376192 4029 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376201 4029 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376209 4029 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 11:50:55.376479 master-0 kubenswrapper[4029]: W0319 11:50:55.376217 4029 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376226 4029 feature_gate.go:330] unrecognized feature 
gate: BootcNodeManagement Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376234 4029 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376243 4029 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376251 4029 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376260 4029 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376268 4029 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376277 4029 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376285 4029 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376294 4029 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376302 4029 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376311 4029 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376320 4029 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376329 4029 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376337 4029 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 11:50:55.377125 
master-0 kubenswrapper[4029]: W0319 11:50:55.376349 4029 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376361 4029 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376373 4029 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376382 4029 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 11:50:55.377125 master-0 kubenswrapper[4029]: W0319 11:50:55.376391 4029 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376400 4029 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376411 4029 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376420 4029 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: I0319 11:50:55.376434 4029 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376686 4029 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376701 4029 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376711 4029 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376720 4029 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376755 4029 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376764 4029 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376773 4029 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376783 4029 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376792 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376800 4029 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 11:50:55.377551 master-0 kubenswrapper[4029]: W0319 11:50:55.376808 4029 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376817 4029 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376825 4029 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376834 4029 feature_gate.go:330] unrecognized feature gate: 
InsightsRuntimeExtractor Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376843 4029 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376852 4029 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376860 4029 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376868 4029 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376877 4029 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376886 4029 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376894 4029 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376902 4029 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376915 4029 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376929 4029 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376938 4029 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376948 4029 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376957 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376967 4029 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376977 4029 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:50:55.377888 master-0 kubenswrapper[4029]: W0319 11:50:55.376988 4029 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.376998 4029 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377008 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377017 4029 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377028 4029 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377037 4029 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377047 4029 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377056 4029 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377066 4029 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377074 4029 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377084 4029 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377093 4029 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377104 4029 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377114 4029 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377123 4029 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377133 4029 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377142 4029 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377150 4029 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377159 4029 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377168 4029 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:50:55.378324 master-0 kubenswrapper[4029]: W0319 11:50:55.377177 4029 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377185 4029 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377194 4029 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377203 4029 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377211 4029 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377223 4029 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377232 4029 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377240 4029 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377249 4029 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377257 4029 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377269 4029 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377280 4029 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377289 4029 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377298 4029 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377307 4029 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377316 4029 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377325 4029 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377333 4029 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377342 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377350 4029 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:50:55.378757 master-0 kubenswrapper[4029]: W0319 11:50:55.377380 4029 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:50:55.379193 master-0 kubenswrapper[4029]: W0319 11:50:55.377389 4029 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:50:55.379193 master-0 kubenswrapper[4029]: W0319 11:50:55.377398 4029 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:50:55.379193 master-0 kubenswrapper[4029]: I0319 11:50:55.377412 4029 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 11:50:55.379193 master-0 kubenswrapper[4029]: I0319 11:50:55.377897 4029 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 11:50:55.384194 master-0 kubenswrapper[4029]: I0319 11:50:55.384131 4029 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 19 11:50:55.385708 master-0 kubenswrapper[4029]: I0319 11:50:55.385662 4029 server.go:997] "Starting client certificate rotation"
Mar 19 11:50:55.385843 master-0 kubenswrapper[4029]: I0319 11:50:55.385717 4029 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 11:50:55.386091 master-0 kubenswrapper[4029]: I0319 11:50:55.386033 4029 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 11:50:55.415760 master-0 kubenswrapper[4029]: I0319 11:50:55.415657 4029 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 11:50:55.422084 master-0 kubenswrapper[4029]: I0319 11:50:55.422024 4029 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 11:50:55.423314 master-0 kubenswrapper[4029]: E0319 11:50:55.423234 4029 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:55.445187 master-0 kubenswrapper[4029]: I0319 11:50:55.445108 4029 log.go:25] "Validated CRI v1 runtime API"
Mar 19 11:50:55.451996 master-0 kubenswrapper[4029]: I0319 11:50:55.451957 4029 log.go:25] "Validated CRI v1 image API"
Mar 19 11:50:55.454438 master-0 kubenswrapper[4029]: I0319 11:50:55.454389 4029 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 11:50:55.459482 master-0 kubenswrapper[4029]: I0319 11:50:55.459431 4029 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 84bbc972-b2a6-48d9-8e4d-c9ff50fad0b0:/dev/vda3 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 19 11:50:55.459534 master-0 kubenswrapper[4029]: I0319 11:50:55.459470 4029 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Mar 19 11:50:55.478946 master-0 kubenswrapper[4029]: I0319 11:50:55.478557 4029 manager.go:217] Machine: {Timestamp:2026-03-19 11:50:55.476467436 +0000 UTC m=+0.553344043 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7514b5d6ada747ba9a1e5c7e73d4e6d3 SystemUUID:7514b5d6-ada7-47ba-9a1e-5c7e73d4e6d3 BootID:bab7eb38-7ae5-4f9e-8147-39f837056abe Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:3b:cf:f0 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:0e:7e:5d:df:5c:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 19 11:50:55.478946 master-0 kubenswrapper[4029]: I0319 11:50:55.478896 4029 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 19 11:50:55.479161 master-0 kubenswrapper[4029]: I0319 11:50:55.479108 4029 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 11:50:55.480769 master-0 kubenswrapper[4029]: I0319 11:50:55.480710 4029 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 11:50:55.481045 master-0 kubenswrapper[4029]: I0319 11:50:55.480987 4029 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 11:50:55.481352 master-0 kubenswrapper[4029]: I0319 11:50:55.481039 4029 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 11:50:55.481408 master-0 kubenswrapper[4029]: I0319 11:50:55.481368 4029 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 11:50:55.481408 master-0 kubenswrapper[4029]: I0319 11:50:55.481382 4029 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 11:50:55.481509 master-0 kubenswrapper[4029]: I0319 11:50:55.481481 4029 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 11:50:55.481540 master-0 kubenswrapper[4029]: I0319 11:50:55.481519 4029 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 11:50:55.481753 master-0 kubenswrapper[4029]: I0319 11:50:55.481696 4029 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 11:50:55.481930 master-0 kubenswrapper[4029]: I0319 11:50:55.481901 4029 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 11:50:55.486679 master-0 kubenswrapper[4029]: I0319 11:50:55.486637 4029 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 11:50:55.486679 master-0 kubenswrapper[4029]: I0319 11:50:55.486676 4029 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 11:50:55.486819 master-0 kubenswrapper[4029]: I0319 11:50:55.486788 4029 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 11:50:55.486852 master-0 kubenswrapper[4029]: I0319 11:50:55.486843 4029 kubelet.go:324] "Adding apiserver pod source"
Mar 19 11:50:55.486879 master-0 kubenswrapper[4029]: I0319 11:50:55.486864 4029 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 11:50:55.493296 master-0 kubenswrapper[4029]: W0319 11:50:55.493166 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:55.493418 master-0 kubenswrapper[4029]: E0319 11:50:55.493375 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:55.493806 master-0 kubenswrapper[4029]: W0319 11:50:55.493675 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:55.493862 master-0 kubenswrapper[4029]: E0319 11:50:55.493833 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:55.497324 master-0 kubenswrapper[4029]: I0319 11:50:55.497282 4029 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 19 11:50:55.500338 master-0 kubenswrapper[4029]: I0319 11:50:55.500308 4029 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 19 11:50:55.500605 master-0 kubenswrapper[4029]: I0319 11:50:55.500578 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 19 11:50:55.500641 master-0 kubenswrapper[4029]: I0319 11:50:55.500607 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 19 11:50:55.500641 master-0 kubenswrapper[4029]: I0319 11:50:55.500618 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 19 11:50:55.500641 master-0 kubenswrapper[4029]: I0319 11:50:55.500627 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 19 11:50:55.500641 master-0 kubenswrapper[4029]: I0319 11:50:55.500636 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 19 11:50:55.500741 master-0 kubenswrapper[4029]: I0319 11:50:55.500646 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 19 11:50:55.500741 master-0 kubenswrapper[4029]: I0319 11:50:55.500655 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 19 11:50:55.500741 master-0 kubenswrapper[4029]: I0319 11:50:55.500664 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 19 11:50:55.500741 master-0 kubenswrapper[4029]: I0319 11:50:55.500674 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 19 11:50:55.500741 master-0 kubenswrapper[4029]: I0319 11:50:55.500683 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 19 11:50:55.500741 master-0 kubenswrapper[4029]: I0319 11:50:55.500697 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 19 11:50:55.500877 master-0 kubenswrapper[4029]: I0319 11:50:55.500833 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 19 11:50:55.502896 master-0 kubenswrapper[4029]: I0319 11:50:55.502868 4029 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 19 11:50:55.503452 master-0 kubenswrapper[4029]: I0319 11:50:55.503426 4029 server.go:1280] "Started kubelet"
Mar 19 11:50:55.504577 master-0 kubenswrapper[4029]: I0319 11:50:55.504440 4029 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 19 11:50:55.504816 master-0 kubenswrapper[4029]: I0319 11:50:55.504439 4029 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 19 11:50:55.504816 master-0 kubenswrapper[4029]: I0319 11:50:55.504652 4029 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 19 11:50:55.505103 master-0 kubenswrapper[4029]: I0319 11:50:55.505083 4029 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 19 11:50:55.505466 master-0 kubenswrapper[4029]: I0319 11:50:55.505431 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:55.505849 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 19 11:50:55.506125 master-0 kubenswrapper[4029]: I0319 11:50:55.506066 4029 server.go:449] "Adding debug handlers to kubelet server"
Mar 19 11:50:55.509764 master-0 kubenswrapper[4029]: I0319 11:50:55.509697 4029 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 11:50:55.509862 master-0 kubenswrapper[4029]: I0319 11:50:55.509831 4029 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 11:50:55.510105 master-0 kubenswrapper[4029]: I0319 11:50:55.510012 4029 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 11:50:55.510105 master-0 kubenswrapper[4029]: I0319 11:50:55.510094 4029 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 11:50:55.510175 master-0 kubenswrapper[4029]: I0319 11:50:55.510059 4029 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 19 11:50:55.510175 master-0 kubenswrapper[4029]: E0319 11:50:55.509983 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:50:55.510574 master-0 kubenswrapper[4029]: E0319 11:50:55.509480 4029 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e3bcd24dd952d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.503389997 +0000 UTC m=+0.580266584,LastTimestamp:2026-03-19 11:50:55.503389997 +0000 UTC m=+0.580266584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:50:55.510904 master-0 kubenswrapper[4029]: W0319 11:50:55.510851 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:55.511006 master-0 kubenswrapper[4029]: E0319 11:50:55.510985 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:55.511057 master-0 kubenswrapper[4029]: I0319 11:50:55.510935 4029 reconstruct.go:97] "Volume reconstruction finished"
Mar 19 11:50:55.511106 master-0 kubenswrapper[4029]: I0319 11:50:55.511096 4029 reconciler.go:26] "Reconciler: start to sync state"
Mar 19 11:50:55.511253 master-0 kubenswrapper[4029]: E0319 11:50:55.510853 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 19 11:50:55.514084 master-0 kubenswrapper[4029]: I0319 11:50:55.514056 4029 factory.go:153] Registering CRI-O factory
Mar 19 11:50:55.514133 master-0 kubenswrapper[4029]: I0319 11:50:55.514092 4029 factory.go:221] Registration of the crio container factory successfully
Mar 19 11:50:55.514189 master-0 kubenswrapper[4029]: I0319 11:50:55.514168 4029 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 11:50:55.514222 master-0 kubenswrapper[4029]: I0319 11:50:55.514190 4029 factory.go:55] Registering systemd factory
Mar 19 11:50:55.514222 master-0 kubenswrapper[4029]: I0319 11:50:55.514201 4029 factory.go:221] Registration of the systemd container factory successfully
Mar 19 11:50:55.514267 master-0 kubenswrapper[4029]: I0319 11:50:55.514260 4029 factory.go:103] Registering Raw factory
Mar 19 11:50:55.514302 master-0 kubenswrapper[4029]: I0319 11:50:55.514282 4029 manager.go:1196] Started watching for new ooms in manager
Mar 19 11:50:55.514426 master-0 kubenswrapper[4029]: E0319 11:50:55.514408 4029 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 19 11:50:55.516195 master-0 kubenswrapper[4029]: I0319 11:50:55.516146 4029 manager.go:319] Starting recovery of all containers
Mar 19 11:50:55.532438 master-0 kubenswrapper[4029]: I0319 11:50:55.532414 4029 manager.go:324] Recovery completed
Mar 19 11:50:55.541206 master-0 kubenswrapper[4029]: I0319 11:50:55.541177 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.543406 master-0 kubenswrapper[4029]: I0319 11:50:55.543312 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.543464 master-0 kubenswrapper[4029]: I0319 11:50:55.543417 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.543464 master-0 kubenswrapper[4029]: I0319 11:50:55.543434 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.544151 master-0 kubenswrapper[4029]: I0319 11:50:55.544124 4029 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 19 11:50:55.544195 master-0 kubenswrapper[4029]: I0319 11:50:55.544149 4029 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 19 11:50:55.544195 master-0 kubenswrapper[4029]: I0319 11:50:55.544176 4029 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 11:50:55.550553 master-0 kubenswrapper[4029]: I0319 11:50:55.550509 4029 policy_none.go:49] "None policy: Start"
Mar 19 11:50:55.551934 master-0 kubenswrapper[4029]: I0319 11:50:55.551881 4029 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 19 11:50:55.551934 master-0 kubenswrapper[4029]: I0319 11:50:55.551914 4029 state_mem.go:35] "Initializing new in-memory state store"
Mar 19 11:50:55.610470 master-0 kubenswrapper[4029]: E0319 11:50:55.610429 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:50:55.628177 master-0 kubenswrapper[4029]: I0319 11:50:55.628099 4029 manager.go:334] "Starting Device Plugin manager"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.628199 4029 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.628219 4029 server.go:79] "Starting device plugin registration server"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.629113 4029 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.629135 4029 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: E0319 11:50:55.631851 4029 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.633652 4029 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.633984 4029 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.634003 4029 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.648231 4029 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.651298 4029 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.651382 4029 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: I0319 11:50:55.651425 4029 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: E0319 11:50:55.651697 4029 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: W0319 11:50:55.658204 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:55.659941 master-0 kubenswrapper[4029]: E0319 11:50:55.658278 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:55.712767 master-0 kubenswrapper[4029]: E0319 11:50:55.712690 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 19 11:50:55.729797 master-0 kubenswrapper[4029]: I0319 11:50:55.729755 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.730687 master-0 kubenswrapper[4029]: I0319 11:50:55.730660 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.730756 master-0 kubenswrapper[4029]: I0319 11:50:55.730687 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.730756 master-0 kubenswrapper[4029]: I0319 11:50:55.730697 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.730756 master-0 kubenswrapper[4029]: I0319 11:50:55.730720 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:50:55.731320 master-0 kubenswrapper[4029]: E0319 11:50:55.731275 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 11:50:55.752549 master-0 kubenswrapper[4029]: I0319 11:50:55.752463 4029 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 11:50:55.752549 master-0 kubenswrapper[4029]: I0319 11:50:55.752560 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.753552 master-0 kubenswrapper[4029]: I0319 11:50:55.753485 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.753552 master-0 kubenswrapper[4029]: I0319 11:50:55.753529 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.753552 master-0 kubenswrapper[4029]: I0319 11:50:55.753538 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.753869 master-0 kubenswrapper[4029]: I0319 11:50:55.753678 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.754263 master-0 kubenswrapper[4029]: I0319 11:50:55.754190 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 11:50:55.754327 master-0 kubenswrapper[4029]: I0319 11:50:55.754290 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.754620 master-0 kubenswrapper[4029]: I0319 11:50:55.754591 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.754667 master-0 kubenswrapper[4029]: I0319 11:50:55.754625 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.754667 master-0 kubenswrapper[4029]: I0319 11:50:55.754635 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.754798 master-0 kubenswrapper[4029]: I0319 11:50:55.754773 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.755136 master-0 kubenswrapper[4029]: I0319 11:50:55.755083 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 11:50:55.755596 master-0 kubenswrapper[4029]: I0319 11:50:55.755556 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.756191 master-0 kubenswrapper[4029]: I0319 11:50:55.756016 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.756191 master-0 kubenswrapper[4029]: I0319 11:50:55.756055 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.756191 master-0 kubenswrapper[4029]: I0319 11:50:55.756069 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.756649 master-0 kubenswrapper[4029]: I0319 11:50:55.756608 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.756649 master-0 kubenswrapper[4029]: I0319 11:50:55.756641 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.756760 master-0 kubenswrapper[4029]: I0319 11:50:55.756664 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.756822 master-0 kubenswrapper[4029]: I0319 11:50:55.756792 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.756822 master-0 kubenswrapper[4029]: I0319 11:50:55.756808 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.756822 master-0 kubenswrapper[4029]: I0319 11:50:55.756820 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.757064 master-0 kubenswrapper[4029]: I0319 11:50:55.757031 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.757358 master-0 kubenswrapper[4029]: I0319 11:50:55.757329 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:50:55.757423 master-0 kubenswrapper[4029]: I0319 11:50:55.757377 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.757810 master-0 kubenswrapper[4029]: I0319 11:50:55.757785 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.757810 master-0 kubenswrapper[4029]: I0319 11:50:55.757809 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.757913 master-0 kubenswrapper[4029]: I0319 11:50:55.757820 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.758127 master-0 kubenswrapper[4029]: I0319 11:50:55.758085 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:55.758180 master-0 kubenswrapper[4029]: I0319 11:50:55.758134 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:55.758180 master-0 kubenswrapper[4029]: I0319 11:50:55.758150 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:55.758254 master-0 kubenswrapper[4029]: I0319 11:50:55.758095 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:55.758297 master-0 kubenswrapper[4029]: I0319 11:50:55.758258 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:55.758343 master-0 kubenswrapper[4029]: I0319 11:50:55.758299 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:50:55.759717 master-0 kubenswrapper[4029]: I0319 11:50:55.759677 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:50:55.759717 master-0 kubenswrapper[4029]: I0319 11:50:55.759721 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:50:55.759841 master-0 kubenswrapper[4029]: I0319 11:50:55.759753 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:50:55.760060 master-0 kubenswrapper[4029]: I0319 11:50:55.760025 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:50:55.760060 master-0 kubenswrapper[4029]: I0319 11:50:55.760056 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:50:55.760502 master-0 kubenswrapper[4029]: I0319 11:50:55.760399 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:50:55.760564 master-0 kubenswrapper[4029]: I0319 11:50:55.760544 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:50:55.760606 master-0 kubenswrapper[4029]: I0319 11:50:55.760568 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:50:55.761419 master-0 kubenswrapper[4029]: I0319 11:50:55.761373 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:50:55.761488 master-0 
kubenswrapper[4029]: I0319 11:50:55.761419 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:50:55.761488 master-0 kubenswrapper[4029]: I0319 11:50:55.761437 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:50:55.812670 master-0 kubenswrapper[4029]: I0319 11:50:55.812585 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.812670 master-0 kubenswrapper[4029]: I0319 11:50:55.812660 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:50:55.812940 master-0 kubenswrapper[4029]: I0319 11:50:55.812745 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:50:55.812940 master-0 kubenswrapper[4029]: I0319 11:50:55.812784 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: 
\"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:50:55.812940 master-0 kubenswrapper[4029]: I0319 11:50:55.812813 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:50:55.812940 master-0 kubenswrapper[4029]: I0319 11:50:55.812842 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.913590 master-0 kubenswrapper[4029]: I0319 11:50:55.913495 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.913590 master-0 kubenswrapper[4029]: I0319 11:50:55.913570 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:55.913899 master-0 kubenswrapper[4029]: I0319 11:50:55.913643 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:55.913899 master-0 kubenswrapper[4029]: I0319 11:50:55.913691 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:50:55.913899 master-0 kubenswrapper[4029]: I0319 11:50:55.913825 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:50:55.913899 master-0 kubenswrapper[4029]: I0319 11:50:55.913858 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.913899 master-0 kubenswrapper[4029]: I0319 11:50:55.913892 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:50:55.914071 master-0 kubenswrapper[4029]: I0319 11:50:55.913927 4029 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.914071 master-0 kubenswrapper[4029]: I0319 11:50:55.913995 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:55.914071 master-0 kubenswrapper[4029]: I0319 11:50:55.914041 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:55.914071 master-0 kubenswrapper[4029]: I0319 11:50:55.914054 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.914204 master-0 kubenswrapper[4029]: I0319 11:50:55.914108 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:50:55.914204 master-0 kubenswrapper[4029]: I0319 11:50:55.914157 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:50:55.914284 master-0 kubenswrapper[4029]: I0319 11:50:55.914202 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:50:55.914284 master-0 kubenswrapper[4029]: I0319 11:50:55.914230 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:50:55.914284 master-0 kubenswrapper[4029]: I0319 11:50:55.914127 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:50:55.914284 master-0 kubenswrapper[4029]: I0319 11:50:55.914235 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 
19 11:50:55.914434 master-0 kubenswrapper[4029]: I0319 11:50:55.914295 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:50:55.914434 master-0 kubenswrapper[4029]: I0319 11:50:55.914291 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.914434 master-0 kubenswrapper[4029]: I0319 11:50:55.914324 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.914434 master-0 kubenswrapper[4029]: I0319 11:50:55.914335 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.914434 master-0 kubenswrapper[4029]: I0319 11:50:55.914365 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:55.914434 master-0 kubenswrapper[4029]: I0319 11:50:55.914397 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:55.931722 master-0 kubenswrapper[4029]: I0319 11:50:55.931666 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:50:55.932886 master-0 kubenswrapper[4029]: I0319 11:50:55.932851 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:50:55.932948 master-0 kubenswrapper[4029]: I0319 11:50:55.932892 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:50:55.932948 master-0 kubenswrapper[4029]: I0319 11:50:55.932936 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:50:55.933020 master-0 kubenswrapper[4029]: I0319 11:50:55.932992 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:50:55.933861 master-0 kubenswrapper[4029]: E0319 11:50:55.933828 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 11:50:56.014861 master-0 kubenswrapper[4029]: I0319 11:50:56.014762 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" 
(UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.015105 master-0 kubenswrapper[4029]: I0319 11:50:56.014885 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.015105 master-0 kubenswrapper[4029]: I0319 11:50:56.014980 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.015105 master-0 kubenswrapper[4029]: I0319 11:50:56.015038 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015105 master-0 kubenswrapper[4029]: I0319 11:50:56.015089 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:50:56.015262 master-0 kubenswrapper[4029]: I0319 11:50:56.015148 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.015262 master-0 kubenswrapper[4029]: I0319 11:50:56.015153 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.015262 master-0 kubenswrapper[4029]: I0319 11:50:56.015187 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015262 master-0 kubenswrapper[4029]: I0319 11:50:56.015194 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015262 master-0 kubenswrapper[4029]: I0319 11:50:56.015243 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015437 master-0 kubenswrapper[4029]: I0319 11:50:56.015258 4029 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.015437 master-0 kubenswrapper[4029]: I0319 11:50:56.015299 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:50:56.015437 master-0 kubenswrapper[4029]: I0319 11:50:56.015257 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:50:56.015437 master-0 kubenswrapper[4029]: I0319 11:50:56.015315 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015581 master-0 kubenswrapper[4029]: I0319 11:50:56.015404 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015581 master-0 kubenswrapper[4029]: I0319 11:50:56.015473 4029 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.015581 master-0 kubenswrapper[4029]: I0319 11:50:56.015453 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:50:56.015581 master-0 kubenswrapper[4029]: I0319 11:50:56.015505 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015581 master-0 kubenswrapper[4029]: I0319 11:50:56.015529 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.015581 master-0 kubenswrapper[4029]: I0319 11:50:56.015533 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015802 master-0 kubenswrapper[4029]: I0319 11:50:56.015667 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.015802 master-0 kubenswrapper[4029]: I0319 11:50:56.015713 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.097200 master-0 kubenswrapper[4029]: I0319 11:50:56.097088 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:50:56.114307 master-0 kubenswrapper[4029]: E0319 11:50:56.114205 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 11:50:56.117561 master-0 kubenswrapper[4029]: I0319 11:50:56.117504 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:50:56.142209 master-0 kubenswrapper[4029]: I0319 11:50:56.142088 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:50:56.165775 master-0 kubenswrapper[4029]: I0319 11:50:56.165673 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:50:56.171176 master-0 kubenswrapper[4029]: I0319 11:50:56.171107 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 11:50:56.334044 master-0 kubenswrapper[4029]: I0319 11:50:56.333969 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:56.335013 master-0 kubenswrapper[4029]: I0319 11:50:56.334968 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:56.335013 master-0 kubenswrapper[4029]: I0319 11:50:56.335013 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:56.335131 master-0 kubenswrapper[4029]: I0319 11:50:56.335030 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:56.335131 master-0 kubenswrapper[4029]: I0319 11:50:56.335095 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:50:56.336091 master-0 kubenswrapper[4029]: E0319 11:50:56.336043 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 11:50:56.507328 master-0 kubenswrapper[4029]: I0319 11:50:56.507246 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:56.579869 master-0 kubenswrapper[4029]: W0319 11:50:56.579773 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:56.579869 master-0 kubenswrapper[4029]: E0319 11:50:56.579849 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:56.612170 master-0 kubenswrapper[4029]: W0319 11:50:56.612094 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:56.612170 master-0 kubenswrapper[4029]: E0319 11:50:56.612154 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:56.727678 master-0 kubenswrapper[4029]: W0319 11:50:56.727583 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83737980b9ee109184b1d78e942cf36.slice/crio-3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21 WatchSource:0}: Error finding container 3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21: Status 404 returned error can't find the container with id 3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21
Mar 19 11:50:56.732169 master-0 kubenswrapper[4029]: I0319 11:50:56.732137 4029 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 11:50:56.784825 master-0 kubenswrapper[4029]: W0319 11:50:56.784762 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1249822f86f23526277d165c0d5d3c19.slice/crio-c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9 WatchSource:0}: Error finding container c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9: Status 404 returned error can't find the container with id c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9
Mar 19 11:50:56.811846 master-0 kubenswrapper[4029]: W0319 11:50:56.811782 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f265536aba6292ead501bc9b49f327.slice/crio-c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4 WatchSource:0}: Error finding container c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4: Status 404 returned error can't find the container with id c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4
Mar 19 11:50:56.915667 master-0 kubenswrapper[4029]: E0319 11:50:56.915603 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 11:50:56.967149 master-0 kubenswrapper[4029]: W0319 11:50:56.967040 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:56.967149 master-0 kubenswrapper[4029]: E0319 11:50:56.967125 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:56.969147 master-0 kubenswrapper[4029]: W0319 11:50:56.969061 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49fac1b46a11e49501805e891baae4a9.slice/crio-cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856 WatchSource:0}: Error finding container cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856: Status 404 returned error can't find the container with id cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856
Mar 19 11:50:57.038903 master-0 kubenswrapper[4029]: W0319 11:50:57.038833 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:57.039078 master-0 kubenswrapper[4029]: E0319 11:50:57.038918 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:57.136906 master-0 kubenswrapper[4029]: I0319 11:50:57.136824 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:57.137867 master-0 kubenswrapper[4029]: I0319 11:50:57.137841 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:57.137928 master-0 kubenswrapper[4029]: I0319 11:50:57.137883 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:57.137928 master-0 kubenswrapper[4029]: I0319 11:50:57.137896 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:57.137994 master-0 kubenswrapper[4029]: I0319 11:50:57.137933 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:50:57.138831 master-0 kubenswrapper[4029]: E0319 11:50:57.138786 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 11:50:57.456940 master-0 kubenswrapper[4029]: I0319 11:50:57.456880 4029 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 11:50:57.458531 master-0 kubenswrapper[4029]: E0319 11:50:57.458465 4029 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:57.507057 master-0 kubenswrapper[4029]: I0319 11:50:57.506967 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:57.537906 master-0 kubenswrapper[4029]: W0319 11:50:57.537870 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd664a6d0d2a24360dee10612610f1b59.slice/crio-34e072f72c93d6369874c6beffaed27fe7a497ddbd4993eb86f92f576e79b6ab WatchSource:0}: Error finding container 34e072f72c93d6369874c6beffaed27fe7a497ddbd4993eb86f92f576e79b6ab: Status 404 returned error can't find the container with id 34e072f72c93d6369874c6beffaed27fe7a497ddbd4993eb86f92f576e79b6ab
Mar 19 11:50:57.656797 master-0 kubenswrapper[4029]: I0319 11:50:57.656702 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"34e072f72c93d6369874c6beffaed27fe7a497ddbd4993eb86f92f576e79b6ab"}
Mar 19 11:50:57.657721 master-0 kubenswrapper[4029]: I0319 11:50:57.657643 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856"}
Mar 19 11:50:57.658847 master-0 kubenswrapper[4029]: I0319 11:50:57.658813 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4"}
Mar 19 11:50:57.659716 master-0 kubenswrapper[4029]: I0319 11:50:57.659693 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9"}
Mar 19 11:50:57.660466 master-0 kubenswrapper[4029]: I0319 11:50:57.660445 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21"}
Mar 19 11:50:58.507096 master-0 kubenswrapper[4029]: I0319 11:50:58.507031 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:58.517021 master-0 kubenswrapper[4029]: E0319 11:50:58.516961 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 19 11:50:58.531281 master-0 kubenswrapper[4029]: W0319 11:50:58.531205 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:58.531488 master-0 kubenswrapper[4029]: E0319 11:50:58.531289 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:58.739996 master-0 kubenswrapper[4029]: I0319 11:50:58.739930 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:58.741453 master-0 kubenswrapper[4029]: I0319 11:50:58.741405 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:58.741529 master-0 kubenswrapper[4029]: I0319 11:50:58.741473 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:58.741529 master-0 kubenswrapper[4029]: I0319 11:50:58.741490 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:58.741631 master-0 kubenswrapper[4029]: I0319 11:50:58.741567 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:50:58.742909 master-0 kubenswrapper[4029]: E0319 11:50:58.742833 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 11:50:59.195551 master-0 kubenswrapper[4029]: W0319 11:50:59.195497 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:59.195551 master-0 kubenswrapper[4029]: E0319 11:50:59.195557 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:59.228408 master-0 kubenswrapper[4029]: W0319 11:50:59.228321 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:59.228408 master-0 kubenswrapper[4029]: E0319 11:50:59.228377 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:59.639596 master-0 kubenswrapper[4029]: I0319 11:50:59.639438 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:59.639596 master-0 kubenswrapper[4029]: W0319 11:50:59.639475 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:50:59.639596 master-0 kubenswrapper[4029]: E0319 11:50:59.639573 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:50:59.666139 master-0 kubenswrapper[4029]: I0319 11:50:59.666068 4029 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="5b728a95b5ae31dab98e905315ad7bc4e11c06682ed7961c2f8d666cf463933f" exitCode=0
Mar 19 11:50:59.666139 master-0 kubenswrapper[4029]: I0319 11:50:59.666112 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"5b728a95b5ae31dab98e905315ad7bc4e11c06682ed7961c2f8d666cf463933f"}
Mar 19 11:50:59.666393 master-0 kubenswrapper[4029]: I0319 11:50:59.666166 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:50:59.666948 master-0 kubenswrapper[4029]: I0319 11:50:59.666922 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:50:59.666997 master-0 kubenswrapper[4029]: I0319 11:50:59.666953 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:50:59.666997 master-0 kubenswrapper[4029]: I0319 11:50:59.666966 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:50:59.701659 master-0 kubenswrapper[4029]: E0319 11:50:59.701547 4029 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e3bcd24dd952d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.503389997 +0000 UTC m=+0.580266584,LastTimestamp:2026-03-19 11:50:55.503389997 +0000 UTC m=+0.580266584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:51:00.506846 master-0 kubenswrapper[4029]: I0319 11:51:00.506590 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:01.507119 master-0 kubenswrapper[4029]: I0319 11:51:01.506958 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:01.668790 master-0 kubenswrapper[4029]: I0319 11:51:01.668702 4029 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 11:51:01.670070 master-0 kubenswrapper[4029]: E0319 11:51:01.670038 4029 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:51:01.675242 master-0 kubenswrapper[4029]: I0319 11:51:01.675172 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log"
Mar 19 11:51:01.675684 master-0 kubenswrapper[4029]: I0319 11:51:01.675635 4029 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="8dfbf4e523f1cef5f1c4787dca02d5625f27520f1b4447d1520ca821835754d8" exitCode=1
Mar 19 11:51:01.675684 master-0 kubenswrapper[4029]: I0319 11:51:01.675677 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"8dfbf4e523f1cef5f1c4787dca02d5625f27520f1b4447d1520ca821835754d8"}
Mar 19 11:51:01.675790 master-0 kubenswrapper[4029]: I0319 11:51:01.675739 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:01.676911 master-0 kubenswrapper[4029]: I0319 11:51:01.676855 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:01.676957 master-0 kubenswrapper[4029]: I0319 11:51:01.676925 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:01.676957 master-0 kubenswrapper[4029]: I0319 11:51:01.676945 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:01.677463 master-0 kubenswrapper[4029]: I0319 11:51:01.677433 4029 scope.go:117] "RemoveContainer" containerID="8dfbf4e523f1cef5f1c4787dca02d5625f27520f1b4447d1520ca821835754d8"
Mar 19 11:51:01.718568 master-0 kubenswrapper[4029]: E0319 11:51:01.718479 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 19 11:51:01.943386 master-0 kubenswrapper[4029]: I0319 11:51:01.943347 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:01.944656 master-0 kubenswrapper[4029]: I0319 11:51:01.944631 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:01.944761 master-0 kubenswrapper[4029]: I0319 11:51:01.944669 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:01.944761 master-0 kubenswrapper[4029]: I0319 11:51:01.944687 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:01.944761 master-0 kubenswrapper[4029]: I0319 11:51:01.944754 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:51:01.945803 master-0 kubenswrapper[4029]: E0319 11:51:01.945717 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 11:51:02.507120 master-0 kubenswrapper[4029]: I0319 11:51:02.507061 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:02.719357 master-0 kubenswrapper[4029]: W0319 11:51:02.719256 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:02.719357 master-0 kubenswrapper[4029]: E0319 11:51:02.719337 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:51:02.733119 master-0 kubenswrapper[4029]: I0319 11:51:02.733084 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log"
Mar 19 11:51:02.733562 master-0 kubenswrapper[4029]: I0319 11:51:02.733533 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"8bd372e238e986e0984170cb681e728cd33a0ebd43bfa74f52c87a36d11bdbf9"}
Mar 19 11:51:02.952356 master-0 kubenswrapper[4029]: W0319 11:51:02.952244 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:02.952356 master-0 kubenswrapper[4029]: E0319 11:51:02.952343 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:51:03.506377 master-0 kubenswrapper[4029]: I0319 11:51:03.506285 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:03.667689 master-0 kubenswrapper[4029]: W0319 11:51:03.667604 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:03.667689 master-0 kubenswrapper[4029]: E0319 11:51:03.667692 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:51:03.737358 master-0 kubenswrapper[4029]: I0319 11:51:03.737319 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log"
Mar 19 11:51:03.737817 master-0 kubenswrapper[4029]: I0319 11:51:03.737800 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log"
Mar 19 11:51:03.738136 master-0 kubenswrapper[4029]: I0319 11:51:03.738109 4029 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="8bd372e238e986e0984170cb681e728cd33a0ebd43bfa74f52c87a36d11bdbf9" exitCode=1
Mar 19 11:51:03.738171 master-0 kubenswrapper[4029]: I0319 11:51:03.738141 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"8bd372e238e986e0984170cb681e728cd33a0ebd43bfa74f52c87a36d11bdbf9"}
Mar 19 11:51:03.738202 master-0 kubenswrapper[4029]: I0319 11:51:03.738187 4029 scope.go:117] "RemoveContainer" containerID="8dfbf4e523f1cef5f1c4787dca02d5625f27520f1b4447d1520ca821835754d8"
Mar 19 11:51:03.738326 master-0 kubenswrapper[4029]: I0319 11:51:03.738291 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:03.739243 master-0 kubenswrapper[4029]: I0319 11:51:03.739219 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:03.739293 master-0 kubenswrapper[4029]: I0319 11:51:03.739248 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:03.739293 master-0 kubenswrapper[4029]: I0319 11:51:03.739260 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:03.739555 master-0 kubenswrapper[4029]: I0319 11:51:03.739534 4029 scope.go:117] "RemoveContainer" containerID="8bd372e238e986e0984170cb681e728cd33a0ebd43bfa74f52c87a36d11bdbf9"
Mar 19 11:51:03.739694 master-0 kubenswrapper[4029]: E0319 11:51:03.739672 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 19 11:51:04.507348 master-0 kubenswrapper[4029]: I0319 11:51:04.507300 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:04.740296 master-0 kubenswrapper[4029]: I0319 11:51:04.740258 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:04.741045 master-0 kubenswrapper[4029]: I0319 11:51:04.741016 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:04.741045 master-0 kubenswrapper[4029]: I0319 11:51:04.741040 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:04.741164 master-0 kubenswrapper[4029]: I0319 11:51:04.741051 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:04.741318 master-0 kubenswrapper[4029]: I0319 11:51:04.741292 4029 scope.go:117] "RemoveContainer" containerID="8bd372e238e986e0984170cb681e728cd33a0ebd43bfa74f52c87a36d11bdbf9"
Mar 19 11:51:04.741437 master-0 kubenswrapper[4029]: E0319 11:51:04.741411 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 19 11:51:04.870438 master-0 kubenswrapper[4029]: W0319 11:51:04.870294 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:04.870438 master-0 kubenswrapper[4029]: E0319 11:51:04.870371 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 11:51:05.507357 master-0 kubenswrapper[4029]: I0319 11:51:05.507290 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:05.632310 master-0 kubenswrapper[4029]: E0319 11:51:05.632251 4029 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 11:51:06.507402 master-0 kubenswrapper[4029]: I0319 11:51:06.507279 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:06.747543 master-0 kubenswrapper[4029]: I0319 11:51:06.747240 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log"
Mar 19 11:51:07.507236 master-0 kubenswrapper[4029]: I0319 11:51:07.507114 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 11:51:07.752053 master-0 kubenswrapper[4029]: I0319 11:51:07.751984 4029 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="d3ab6ca62e19d2ef407ccf237743444ad88802357f607cafd2e5c6b8ac29d477" exitCode=0
Mar 19 11:51:07.752053 master-0 kubenswrapper[4029]: I0319 11:51:07.752051 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"d3ab6ca62e19d2ef407ccf237743444ad88802357f607cafd2e5c6b8ac29d477"}
Mar 19 11:51:07.753203 master-0 kubenswrapper[4029]: I0319 11:51:07.752171 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:07.753203 master-0 kubenswrapper[4029]: I0319 11:51:07.753157 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:07.753203 master-0 kubenswrapper[4029]: I0319 11:51:07.753185 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:07.753203 master-0 kubenswrapper[4029]: I0319 11:51:07.753200 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:07.755266 master-0 kubenswrapper[4029]: I0319 11:51:07.755188 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6"}
Mar 19 11:51:07.757278 master-0 kubenswrapper[4029]: I0319 11:51:07.757151 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"901ed10fec5e9417fcd7522a27f15f9a949e9c0dd2ab8e429fd9b30afd0247bf"}
Mar 19 11:51:07.757348 master-0 kubenswrapper[4029]: I0319 11:51:07.757286 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:07.758099 master-0 kubenswrapper[4029]: I0319 11:51:07.758073 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:07.758652 master-0 kubenswrapper[4029]: I0319 11:51:07.758621 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:07.758652 master-0 kubenswrapper[4029]: I0319 11:51:07.758647 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:07.758788 master-0 kubenswrapper[4029]: I0319 11:51:07.758659 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:07.759101 master-0 kubenswrapper[4029]: I0319 11:51:07.759074 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:07.759101 master-0 kubenswrapper[4029]: I0319 11:51:07.759096 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:07.759199 master-0 kubenswrapper[4029]: I0319 11:51:07.759107 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:07.760784 master-0 kubenswrapper[4029]: I0319 11:51:07.760740 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861"}
Mar 19 11:51:07.760831 master-0 kubenswrapper[4029]: I0319 11:51:07.760785 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c"}
Mar 19 11:51:07.760909 master-0 kubenswrapper[4029]: I0319 11:51:07.760883 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:07.761847 master-0 kubenswrapper[4029]: I0319 11:51:07.761804 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:07.761910 master-0 kubenswrapper[4029]: I0319 11:51:07.761865 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:07.761910 master-0 kubenswrapper[4029]: I0319 11:51:07.761880 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:08.346721 master-0 kubenswrapper[4029]: I0319 11:51:08.346657 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:08.348226 master-0 kubenswrapper[4029]: I0319 11:51:08.347761 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:08.348226 master-0 kubenswrapper[4029]: I0319 11:51:08.347808 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:08.348226 master-0 kubenswrapper[4029]: I0319 11:51:08.347819 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:08.348226 master-0 kubenswrapper[4029]: I0319 11:51:08.347865 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:51:08.766868 master-0 kubenswrapper[4029]: I0319 11:51:08.766771 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"21c17e15f1723f8eb75ec60f42ebd73c793697e640249886764928c881dbaaa1"}
Mar 19 11:51:08.766868 master-0 kubenswrapper[4029]: I0319 11:51:08.766827 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:08.767615 master-0 kubenswrapper[4029]: I0319 11:51:08.766828 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:08.769405 master-0 kubenswrapper[4029]: I0319 11:51:08.768697 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:08.769405 master-0 kubenswrapper[4029]: I0319 11:51:08.768754 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:08.769405 master-0 kubenswrapper[4029]: I0319 11:51:08.768954 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:08.769405 master-0 kubenswrapper[4029]: I0319 11:51:08.769197 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:08.769405 master-0 kubenswrapper[4029]: I0319 11:51:08.769289 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:08.769405 master-0 kubenswrapper[4029]: I0319 11:51:08.769318 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:09.474424 master-0 kubenswrapper[4029]: I0319 11:51:09.474357 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:09.474424 master-0 kubenswrapper[4029]: E0319 11:51:09.474365 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 19 11:51:09.474424 master-0 kubenswrapper[4029]: E0319 11:51:09.474370 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 11:51:09.512253 master-0 kubenswrapper[4029]: I0319 11:51:09.512206 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:09.715707 master-0 kubenswrapper[4029]: E0319 11:51:09.714977 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd24dd952d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.503389997 +0000 UTC m=+0.580266584,LastTimestamp:2026-03-19 11:50:55.503389997 +0000 UTC m=+0.580266584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.723372 master-0 kubenswrapper[4029]: E0319 11:51:09.723189 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd274011bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,LastTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.729829 master-0 kubenswrapper[4029]: E0319 11:51:09.729566 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740889a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node 
master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,LastTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.735194 master-0 kubenswrapper[4029]: E0319 11:51:09.735068 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740b6bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543441084 +0000 UTC m=+0.620317661,LastTimestamp:2026-03-19 11:50:55.543441084 +0000 UTC m=+0.620317661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.740408 master-0 kubenswrapper[4029]: E0319 11:51:09.740258 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2d3d43ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 
11:50:55.643878399 +0000 UTC m=+0.720754976,LastTimestamp:2026-03-19 11:50:55.643878399 +0000 UTC m=+0.720754976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.745938 master-0 kubenswrapper[4029]: E0319 11:51:09.745834 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd274011bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd274011bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,LastTimestamp:2026-03-19 11:50:55.73067995 +0000 UTC m=+0.807556507,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.750755 master-0 kubenswrapper[4029]: E0319 11:51:09.750595 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740889a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740889a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,LastTimestamp:2026-03-19 
11:50:55.73069349 +0000 UTC m=+0.807570057,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.754855 master-0 kubenswrapper[4029]: E0319 11:51:09.754780 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740b6bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740b6bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543441084 +0000 UTC m=+0.620317661,LastTimestamp:2026-03-19 11:50:55.73070249 +0000 UTC m=+0.807579047,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.758487 master-0 kubenswrapper[4029]: E0319 11:51:09.758352 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd274011bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd274011bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,LastTimestamp:2026-03-19 11:50:55.75351854 +0000 UTC 
m=+0.830395107,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.762670 master-0 kubenswrapper[4029]: E0319 11:51:09.762500 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740889a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740889a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,LastTimestamp:2026-03-19 11:50:55.75353524 +0000 UTC m=+0.830411807,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.767411 master-0 kubenswrapper[4029]: E0319 11:51:09.767287 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740b6bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740b6bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543441084 +0000 UTC m=+0.620317661,LastTimestamp:2026-03-19 11:50:55.75354376 +0000 UTC m=+0.830420327,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.770206 master-0 kubenswrapper[4029]: I0319 11:51:09.770162 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"6ecb192a1cfeb4529102ad33aeed1229502ac0d4a0688a01c8e90bffa6cdc39c"} Mar 19 11:51:09.770296 master-0 kubenswrapper[4029]: I0319 11:51:09.770261 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:09.771053 master-0 kubenswrapper[4029]: I0319 11:51:09.771010 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:09.771053 master-0 kubenswrapper[4029]: I0319 11:51:09.771052 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:09.771151 master-0 kubenswrapper[4029]: I0319 11:51:09.771064 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:09.772675 master-0 kubenswrapper[4029]: E0319 11:51:09.772582 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd274011bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd274011bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,LastTimestamp:2026-03-19 11:50:55.754612224 +0000 UTC 
m=+0.831488791,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.776957 master-0 kubenswrapper[4029]: E0319 11:51:09.776864 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740889a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740889a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,LastTimestamp:2026-03-19 11:50:55.754631084 +0000 UTC m=+0.831507651,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.780916 master-0 kubenswrapper[4029]: E0319 11:51:09.780786 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740b6bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740b6bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543441084 +0000 UTC m=+0.620317661,LastTimestamp:2026-03-19 11:50:55.754639944 +0000 UTC m=+0.831516511,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.790209 master-0 kubenswrapper[4029]: E0319 11:51:09.790073 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd274011bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd274011bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,LastTimestamp:2026-03-19 11:50:55.756043682 +0000 UTC m=+0.832920259,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.795805 master-0 kubenswrapper[4029]: E0319 11:51:09.795644 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740889a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740889a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,LastTimestamp:2026-03-19 11:50:55.756063622 +0000 UTC m=+0.832940189,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.800941 master-0 kubenswrapper[4029]: E0319 11:51:09.800806 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740b6bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740b6bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543441084 +0000 UTC m=+0.620317661,LastTimestamp:2026-03-19 11:50:55.756075392 +0000 UTC m=+0.832951959,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.806885 master-0 kubenswrapper[4029]: E0319 11:51:09.806742 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd274011bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd274011bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,LastTimestamp:2026-03-19 11:50:55.75663337 +0000 UTC m=+0.833509937,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.815131 master-0 kubenswrapper[4029]: E0319 11:51:09.814973 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740889a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740889a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,LastTimestamp:2026-03-19 11:50:55.7566571 +0000 UTC m=+0.833533677,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.820161 master-0 kubenswrapper[4029]: E0319 11:51:09.820048 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740b6bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740b6bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543441084 +0000 UTC m=+0.620317661,LastTimestamp:2026-03-19 11:50:55.75667173 +0000 UTC m=+0.833548297,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.824905 master-0 kubenswrapper[4029]: E0319 11:51:09.824759 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd274011bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd274011bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,LastTimestamp:2026-03-19 11:50:55.756802792 +0000 UTC m=+0.833679359,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.829765 master-0 kubenswrapper[4029]: E0319 11:51:09.829651 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740889a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740889a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,LastTimestamp:2026-03-19 11:50:55.756814832 +0000 UTC m=+0.833691399,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.833623 master-0 kubenswrapper[4029]: E0319 11:51:09.833464 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740b6bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740b6bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543441084 +0000 UTC m=+0.620317661,LastTimestamp:2026-03-19 11:50:55.756831742 +0000 UTC m=+0.833708309,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.837814 master-0 kubenswrapper[4029]: E0319 11:51:09.837666 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd274011bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd274011bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543398844 +0000 UTC m=+0.620275441,LastTimestamp:2026-03-19 11:50:55.757803224 +0000 UTC m=+0.834679801,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.842810 master-0 kubenswrapper[4029]: E0319 11:51:09.842634 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bcd2740889a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bcd2740889a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:55.543429274 +0000 UTC m=+0.620305851,LastTimestamp:2026-03-19 11:50:55.757815994 +0000 UTC m=+0.834692561,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.848931 master-0 kubenswrapper[4029]: E0319 11:51:09.848784 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3bcd6e1a43b5 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:56.732103605 +0000 UTC m=+1.808980172,LastTimestamp:2026-03-19 
11:50:56.732103605 +0000 UTC m=+1.808980172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.853922 master-0 kubenswrapper[4029]: E0319 11:51:09.853797 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bcd7167f498 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:56.787526808 +0000 UTC m=+1.864403385,LastTimestamp:2026-03-19 11:50:56.787526808 +0000 UTC m=+1.864403385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.858575 master-0 kubenswrapper[4029]: E0319 11:51:09.858415 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bcd72f3fbcb kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:56.813480907 +0000 UTC m=+1.890357474,LastTimestamp:2026-03-19 11:50:56.813480907 +0000 UTC m=+1.890357474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.863996 master-0 kubenswrapper[4029]: E0319 11:51:09.863816 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bcd7c5cb147 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:56.971338055 +0000 UTC m=+2.048214632,LastTimestamp:2026-03-19 11:50:56.971338055 +0000 UTC m=+2.048214632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.869275 master-0 kubenswrapper[4029]: E0319 11:51:09.869126 4029 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bcd9e4374bc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:57.5401095 +0000 UTC m=+2.616986067,LastTimestamp:2026-03-19 11:50:57.5401095 +0000 UTC m=+2.616986067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.873752 master-0 kubenswrapper[4029]: E0319 11:51:09.873599 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bcde413641d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" in 1.923s (1.923s including waiting). 
Image size: 465090934 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:58.711364637 +0000 UTC m=+3.788241204,LastTimestamp:2026-03-19 11:50:58.711364637 +0000 UTC m=+3.788241204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.878155 master-0 kubenswrapper[4029]: E0319 11:51:09.877962 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bcdefaba747 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:58.905892679 +0000 UTC m=+3.982769246,LastTimestamp:2026-03-19 11:50:58.905892679 +0000 UTC m=+3.982769246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.883459 master-0 kubenswrapper[4029]: E0319 11:51:09.883253 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bcdf0c5f247 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:58.924393031 +0000 UTC m=+4.001269598,LastTimestamp:2026-03-19 11:50:58.924393031 +0000 UTC m=+4.001269598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.889247 master-0 kubenswrapper[4029]: E0319 11:51:09.888953 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce1d2cf29c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:59.669340828 +0000 UTC m=+4.746217405,LastTimestamp:2026-03-19 11:50:59.669340828 +0000 UTC m=+4.746217405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.896498 master-0 kubenswrapper[4029]: E0319 11:51:09.896279 4029 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce589e7c5a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:00.666637402 +0000 UTC m=+5.743513969,LastTimestamp:2026-03-19 11:51:00.666637402 +0000 UTC m=+5.743513969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.903945 master-0 kubenswrapper[4029]: E0319 11:51:09.903769 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce72b71e4a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:01.104459338 +0000 UTC m=+6.181335905,LastTimestamp:2026-03-19 11:51:01.104459338 +0000 UTC m=+6.181335905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.910246 master-0 kubenswrapper[4029]: E0319 11:51:09.910036 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bce1d2cf29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce1d2cf29c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:59.669340828 +0000 UTC m=+4.746217405,LastTimestamp:2026-03-19 11:51:01.681721918 +0000 UTC m=+6.758598485,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.933969 master-0 kubenswrapper[4029]: E0319 11:51:09.932876 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bce589e7c5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce589e7c5a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:00.666637402 +0000 UTC m=+5.743513969,LastTimestamp:2026-03-19 11:51:02.389570321 +0000 UTC m=+7.466446878,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.944702 master-0 kubenswrapper[4029]: E0319 11:51:09.944494 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bce72b71e4a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce72b71e4a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:01.104459338 +0000 UTC m=+6.181335905,LastTimestamp:2026-03-19 11:51:02.742180598 +0000 UTC m=+7.819057165,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.950492 master-0 kubenswrapper[4029]: E0319 11:51:09.950338 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bcf0fc8f2d6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:03.739650774 +0000 UTC m=+8.816527341,LastTimestamp:2026-03-19 11:51:03.739650774 +0000 UTC m=+8.816527341,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.956242 master-0 kubenswrapper[4029]: E0319 11:51:09.956147 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bcf0fc8f2d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bcf0fc8f2d6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod 
kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:03.739650774 +0000 UTC m=+8.816527341,LastTimestamp:2026-03-19 11:51:04.741394902 +0000 UTC m=+9.818271469,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.961637 master-0 kubenswrapper[4029]: E0319 11:51:09.961460 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bcfc3a25471 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 9.785s (9.785s including waiting). 
Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.757018737 +0000 UTC m=+11.833895304,LastTimestamp:2026-03-19 11:51:06.757018737 +0000 UTC m=+11.833895304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.967690 master-0 kubenswrapper[4029]: E0319 11:51:09.967504 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bcfc3a32d7a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" in 9.216s (9.216s including waiting). 
Image size: 529326739 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.757074298 +0000 UTC m=+11.833950855,LastTimestamp:2026-03-19 11:51:06.757074298 +0000 UTC m=+11.833950855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.973158 master-0 kubenswrapper[4029]: E0319 11:51:09.973042 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3bcfc5a18a9e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 10.058s (10.058s including waiting). 
Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.790521502 +0000 UTC m=+11.867398069,LastTimestamp:2026-03-19 11:51:06.790521502 +0000 UTC m=+11.867398069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.978231 master-0 kubenswrapper[4029]: E0319 11:51:09.978123 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bcfc6a565b6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 9.994s (9.994s including waiting). 
Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.807551414 +0000 UTC m=+11.884427981,LastTimestamp:2026-03-19 11:51:06.807551414 +0000 UTC m=+11.884427981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.982601 master-0 kubenswrapper[4029]: E0319 11:51:09.982424 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3bcfcf056dd4 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.948062676 +0000 UTC m=+12.024939243,LastTimestamp:2026-03-19 11:51:06.948062676 +0000 UTC m=+12.024939243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.986651 master-0 kubenswrapper[4029]: E0319 11:51:09.986564 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bcfcf0bec59 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.948488281 +0000 UTC m=+12.025364848,LastTimestamp:2026-03-19 11:51:06.948488281 +0000 UTC m=+12.025364848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.990909 master-0 kubenswrapper[4029]: E0319 11:51:09.990839 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bcfcf0e9efc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.948665084 +0000 UTC m=+12.025541651,LastTimestamp:2026-03-19 11:51:06.948665084 +0000 UTC m=+12.025541651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.994669 master-0 kubenswrapper[4029]: E0319 11:51:09.994604 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3bcfcfc42a69 kube-system 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.960562793 +0000 UTC m=+12.037439360,LastTimestamp:2026-03-19 11:51:06.960562793 +0000 UTC m=+12.037439360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:09.998916 master-0 kubenswrapper[4029]: E0319 11:51:09.998740 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bcfcfc60bfa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.960686074 +0000 UTC m=+12.037562641,LastTimestamp:2026-03-19 11:51:06.960686074 +0000 UTC m=+12.037562641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.003353 master-0 kubenswrapper[4029]: E0319 11:51:10.003254 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bcfcfe2eff1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.962579441 +0000 UTC m=+12.039456008,LastTimestamp:2026-03-19 11:51:06.962579441 +0000 UTC m=+12.039456008,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.006777 master-0 kubenswrapper[4029]: E0319 11:51:10.006692 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bcfcff2465d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:06.963584605 +0000 UTC m=+12.040461172,LastTimestamp:2026-03-19 11:51:06.963584605 +0000 UTC m=+12.040461172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.010374 master-0 kubenswrapper[4029]: E0319 11:51:10.010302 4029 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bcfd21f1ca1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:07.000077473 +0000 UTC m=+12.076954040,LastTimestamp:2026-03-19 11:51:07.000077473 +0000 UTC m=+12.076954040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.015231 master-0 kubenswrapper[4029]: E0319 11:51:10.015169 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bcfd2f22f11 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:07.013910289 +0000 UTC m=+12.090786856,LastTimestamp:2026-03-19 11:51:07.013910289 +0000 UTC m=+12.090786856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.019010 master-0 kubenswrapper[4029]: E0319 11:51:10.018685 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bcfd3012187 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:07.014889863 +0000 UTC m=+12.091766430,LastTimestamp:2026-03-19 11:51:07.014889863 +0000 UTC m=+12.091766430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.023378 master-0 kubenswrapper[4029]: E0319 11:51:10.023301 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bcfdb25919a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 
11:51:07.151495578 +0000 UTC m=+12.228372135,LastTimestamp:2026-03-19 11:51:07.151495578 +0000 UTC m=+12.228372135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.027333 master-0 kubenswrapper[4029]: E0319 11:51:10.027203 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bcfdbefb645 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:07.164743237 +0000 UTC m=+12.241619804,LastTimestamp:2026-03-19 11:51:07.164743237 +0000 UTC m=+12.241619804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.034382 master-0 kubenswrapper[4029]: E0319 11:51:10.032176 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bcfff4bdc56 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:07.757984854 +0000 UTC m=+12.834861431,LastTimestamp:2026-03-19 11:51:07.757984854 +0000 UTC m=+12.834861431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.038854 master-0 kubenswrapper[4029]: E0319 11:51:10.038777 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bd00bbc0675 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:07.966662261 +0000 UTC m=+13.043538828,LastTimestamp:2026-03-19 11:51:07.966662261 +0000 UTC m=+13.043538828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.043372 master-0 kubenswrapper[4029]: E0319 11:51:10.043291 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bd00d57d782 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:07.993651074 +0000 UTC m=+13.070527641,LastTimestamp:2026-03-19 11:51:07.993651074 +0000 UTC m=+13.070527641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.048329 master-0 kubenswrapper[4029]: E0319 11:51:10.048224 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bd00d6fa3be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:07.995210686 +0000 UTC m=+13.072087253,LastTimestamp:2026-03-19 11:51:07.995210686 +0000 UTC m=+13.072087253,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.053306 master-0 kubenswrapper[4029]: E0319 11:51:10.053187 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bd067a01318 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\" in 2.493s (2.493s including waiting). Image size: 505246690 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:09.50833436 +0000 UTC m=+14.585210937,LastTimestamp:2026-03-19 11:51:09.50833436 +0000 UTC m=+14.585210937,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.058056 master-0 kubenswrapper[4029]: E0319 11:51:10.057940 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bd073f98015 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:09.715521557 +0000 UTC m=+14.792398134,LastTimestamp:2026-03-19 11:51:09.715521557 +0000 UTC 
m=+14.792398134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.065756 master-0 kubenswrapper[4029]: E0319 11:51:10.065563 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bd074ae7c1b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:09.727382555 +0000 UTC m=+14.804259122,LastTimestamp:2026-03-19 11:51:09.727382555 +0000 UTC m=+14.804259122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:10.318875 master-0 kubenswrapper[4029]: I0319 11:51:10.318595 4029 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 11:51:10.336444 master-0 kubenswrapper[4029]: I0319 11:51:10.336345 4029 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 11:51:10.518508 master-0 kubenswrapper[4029]: I0319 11:51:10.518440 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:10.772669 master-0 kubenswrapper[4029]: I0319 
11:51:10.772598 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:10.773888 master-0 kubenswrapper[4029]: I0319 11:51:10.773837 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:10.773935 master-0 kubenswrapper[4029]: I0319 11:51:10.773913 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:10.773935 master-0 kubenswrapper[4029]: I0319 11:51:10.773930 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:11.514236 master-0 kubenswrapper[4029]: I0319 11:51:11.514161 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:12.234593 master-0 kubenswrapper[4029]: I0319 11:51:12.234047 4029 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:12.234593 master-0 kubenswrapper[4029]: I0319 11:51:12.234309 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:12.235969 master-0 kubenswrapper[4029]: I0319 11:51:12.235901 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:12.236027 master-0 kubenswrapper[4029]: I0319 11:51:12.235977 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:12.236027 master-0 kubenswrapper[4029]: I0319 11:51:12.236019 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:12.239954 master-0 
kubenswrapper[4029]: I0319 11:51:12.239922 4029 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:12.397783 master-0 kubenswrapper[4029]: W0319 11:51:12.397714 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 19 11:51:12.398045 master-0 kubenswrapper[4029]: E0319 11:51:12.397804 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 11:51:12.511968 master-0 kubenswrapper[4029]: I0319 11:51:12.511839 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:12.535318 master-0 kubenswrapper[4029]: W0319 11:51:12.535273 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 19 11:51:12.535549 master-0 kubenswrapper[4029]: E0319 11:51:12.535358 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 11:51:12.778085 master-0 kubenswrapper[4029]: I0319 11:51:12.777908 4029 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:12.778085 master-0 kubenswrapper[4029]: I0319 11:51:12.778027 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:12.779206 master-0 kubenswrapper[4029]: I0319 11:51:12.779118 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:12.779206 master-0 kubenswrapper[4029]: I0319 11:51:12.779161 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:12.779206 master-0 kubenswrapper[4029]: I0319 11:51:12.779172 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:12.848562 master-0 kubenswrapper[4029]: W0319 11:51:12.848365 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 11:51:12.848562 master-0 kubenswrapper[4029]: E0319 11:51:12.848534 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 11:51:12.895955 master-0 kubenswrapper[4029]: E0319 11:51:12.895540 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bd130e68710 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" in 4.889s (4.889s including waiting). Image size: 514984269 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:12.885171984 +0000 UTC m=+17.962048551,LastTimestamp:2026-03-19 11:51:12.885171984 +0000 UTC m=+17.962048551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:13.238352 master-0 kubenswrapper[4029]: E0319 11:51:13.237706 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bd14569c2b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:13.229316788 +0000 UTC m=+18.306193355,LastTimestamp:2026-03-19 11:51:13.229316788 +0000 UTC m=+18.306193355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:13.430830 master-0 kubenswrapper[4029]: E0319 
11:51:13.430330 4029 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bd150e344a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:13.42182928 +0000 UTC m=+18.498705857,LastTimestamp:2026-03-19 11:51:13.42182928 +0000 UTC m=+18.498705857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:13.500858 master-0 kubenswrapper[4029]: W0319 11:51:13.500667 4029 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:13.500858 master-0 kubenswrapper[4029]: E0319 11:51:13.500842 4029 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 11:51:13.511441 master-0 kubenswrapper[4029]: I0319 11:51:13.511367 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:13.784477 master-0 kubenswrapper[4029]: I0319 11:51:13.784376 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:13.784477 master-0 kubenswrapper[4029]: I0319 11:51:13.784389 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"39c756c5e9204811d8c83cfa45ff7447029413f92b87a61b82da1dc41e1a076d"} Mar 19 11:51:13.784477 master-0 kubenswrapper[4029]: I0319 11:51:13.784376 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:13.785477 master-0 kubenswrapper[4029]: I0319 11:51:13.785423 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:13.785477 master-0 kubenswrapper[4029]: I0319 11:51:13.785477 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:13.785661 master-0 kubenswrapper[4029]: I0319 11:51:13.785483 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:13.785661 master-0 kubenswrapper[4029]: I0319 11:51:13.785513 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:13.785661 master-0 kubenswrapper[4029]: I0319 11:51:13.785523 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:13.785661 master-0 kubenswrapper[4029]: I0319 11:51:13.785490 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:13.939585 master-0 kubenswrapper[4029]: I0319 
11:51:13.939493 4029 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:13.943661 master-0 kubenswrapper[4029]: I0319 11:51:13.943625 4029 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:14.513469 master-0 kubenswrapper[4029]: I0319 11:51:14.513404 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:14.786984 master-0 kubenswrapper[4029]: I0319 11:51:14.786757 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:14.786984 master-0 kubenswrapper[4029]: I0319 11:51:14.786964 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:14.787982 master-0 kubenswrapper[4029]: I0319 11:51:14.787940 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:14.788069 master-0 kubenswrapper[4029]: I0319 11:51:14.787987 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:14.788069 master-0 kubenswrapper[4029]: I0319 11:51:14.788002 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:14.794567 master-0 kubenswrapper[4029]: I0319 11:51:14.794506 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:15.513797 master-0 kubenswrapper[4029]: I0319 11:51:15.513713 4029 csi_plugin.go:884] Failed to contact API server when waiting 
for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:15.632597 master-0 kubenswrapper[4029]: E0319 11:51:15.632491 4029 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 11:51:15.789484 master-0 kubenswrapper[4029]: I0319 11:51:15.789255 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:15.790364 master-0 kubenswrapper[4029]: I0319 11:51:15.790265 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:15.790364 master-0 kubenswrapper[4029]: I0319 11:51:15.790359 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:15.790587 master-0 kubenswrapper[4029]: I0319 11:51:15.790382 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:16.474797 master-0 kubenswrapper[4029]: I0319 11:51:16.474633 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:16.476272 master-0 kubenswrapper[4029]: I0319 11:51:16.476201 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:16.476272 master-0 kubenswrapper[4029]: I0319 11:51:16.476266 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:16.476272 master-0 kubenswrapper[4029]: I0319 11:51:16.476277 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:16.476569 master-0 kubenswrapper[4029]: I0319 11:51:16.476357 4029 kubelet_node_status.go:76] "Attempting 
to register node" node="master-0" Mar 19 11:51:16.482067 master-0 kubenswrapper[4029]: E0319 11:51:16.482005 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 11:51:16.482067 master-0 kubenswrapper[4029]: E0319 11:51:16.482021 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 19 11:51:16.512260 master-0 kubenswrapper[4029]: I0319 11:51:16.512178 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:16.792116 master-0 kubenswrapper[4029]: I0319 11:51:16.791905 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:16.793114 master-0 kubenswrapper[4029]: I0319 11:51:16.793058 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:16.793114 master-0 kubenswrapper[4029]: I0319 11:51:16.793111 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:16.793114 master-0 kubenswrapper[4029]: I0319 11:51:16.793122 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:17.236293 master-0 kubenswrapper[4029]: I0319 11:51:17.236185 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:17.236644 master-0 kubenswrapper[4029]: I0319 
11:51:17.236533 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:17.238161 master-0 kubenswrapper[4029]: I0319 11:51:17.238089 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:17.238317 master-0 kubenswrapper[4029]: I0319 11:51:17.238168 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:17.238317 master-0 kubenswrapper[4029]: I0319 11:51:17.238193 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:17.512479 master-0 kubenswrapper[4029]: I0319 11:51:17.512313 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:18.514739 master-0 kubenswrapper[4029]: I0319 11:51:18.514546 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:51:19.054037 master-0 kubenswrapper[4029]: I0319 11:51:19.053957 4029 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:19.054363 master-0 kubenswrapper[4029]: I0319 11:51:19.054320 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:19.056910 master-0 kubenswrapper[4029]: I0319 11:51:19.056857 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:19.056987 master-0 kubenswrapper[4029]: I0319 11:51:19.056924 4029 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:19.056987 master-0 kubenswrapper[4029]: I0319 11:51:19.056944 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:19.058467 master-0 kubenswrapper[4029]: I0319 11:51:19.058424 4029 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:51:19.058600 master-0 kubenswrapper[4029]: I0319 11:51:19.058547 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:51:19.513182 master-0 kubenswrapper[4029]: I0319 11:51:19.512942 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:51:19.652875 master-0 kubenswrapper[4029]: I0319 11:51:19.652760 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:19.654638 master-0 kubenswrapper[4029]: I0319 11:51:19.654560 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:19.654638 master-0 kubenswrapper[4029]: I0319 11:51:19.654639 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:19.654850 master-0 kubenswrapper[4029]: I0319 11:51:19.654659 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:19.655447 master-0 kubenswrapper[4029]: I0319 11:51:19.655401 4029 scope.go:117] "RemoveContainer" containerID="8bd372e238e986e0984170cb681e728cd33a0ebd43bfa74f52c87a36d11bdbf9"
Mar 19 11:51:19.670634 master-0 kubenswrapper[4029]: E0319 11:51:19.670416 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bce1d2cf29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce1d2cf29c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:50:59.669340828 +0000 UTC m=+4.746217405,LastTimestamp:2026-03-19 11:51:19.659679727 +0000 UTC m=+24.736556334,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:51:19.723027 master-0 kubenswrapper[4029]: I0319 11:51:19.722960 4029 csr.go:261] certificate signing request csr-bh6f9 is approved, waiting to be issued
Mar 19 11:51:19.799332 master-0 kubenswrapper[4029]: I0319 11:51:19.799149 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:19.800385 master-0 kubenswrapper[4029]: I0319 11:51:19.800347 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:19.800385 master-0 kubenswrapper[4029]: I0319 11:51:19.800385 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:19.800509 master-0 kubenswrapper[4029]: I0319 11:51:19.800399 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:19.804353 master-0 kubenswrapper[4029]: I0319 11:51:19.804320 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:51:19.898988 master-0 kubenswrapper[4029]: E0319 11:51:19.898242 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bce589e7c5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce589e7c5a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:00.666637402 +0000 UTC m=+5.743513969,LastTimestamp:2026-03-19 11:51:19.887698275 +0000 UTC m=+24.964574842,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:51:19.916330 master-0 kubenswrapper[4029]: E0319 11:51:19.915879 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bce72b71e4a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bce72b71e4a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:01.104459338 +0000 UTC m=+6.181335905,LastTimestamp:2026-03-19 11:51:19.90709604 +0000 UTC m=+24.983972617,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:51:20.512508 master-0 kubenswrapper[4029]: I0319 11:51:20.512417 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:51:20.803310 master-0 kubenswrapper[4029]: I0319 11:51:20.803112 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 19 11:51:20.804314 master-0 kubenswrapper[4029]: I0319 11:51:20.803587 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log"
Mar 19 11:51:20.804314 master-0 kubenswrapper[4029]: I0319 11:51:20.803922 4029 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5" exitCode=1
Mar 19 11:51:20.804314 master-0 kubenswrapper[4029]: I0319 11:51:20.804027 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:20.804314 master-0 kubenswrapper[4029]: I0319 11:51:20.804016 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5"}
Mar 19 11:51:20.804314 master-0 kubenswrapper[4029]: I0319 11:51:20.804078 4029 scope.go:117] "RemoveContainer" containerID="8bd372e238e986e0984170cb681e728cd33a0ebd43bfa74f52c87a36d11bdbf9"
Mar 19 11:51:20.804314 master-0 kubenswrapper[4029]: I0319 11:51:20.804269 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:20.804851 master-0 kubenswrapper[4029]: I0319 11:51:20.804798 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:20.804851 master-0 kubenswrapper[4029]: I0319 11:51:20.804826 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:20.804851 master-0 kubenswrapper[4029]: I0319 11:51:20.804837 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:20.805879 master-0 kubenswrapper[4029]: I0319 11:51:20.805564 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:20.805879 master-0 kubenswrapper[4029]: I0319 11:51:20.805595 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:20.805879 master-0 kubenswrapper[4029]: I0319 11:51:20.805606 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:20.805879 master-0 kubenswrapper[4029]: I0319 11:51:20.805845 4029 scope.go:117] "RemoveContainer" containerID="d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5"
Mar 19 11:51:20.806307 master-0 kubenswrapper[4029]: E0319 11:51:20.805966 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 19 11:51:20.812900 master-0 kubenswrapper[4029]: E0319 11:51:20.812599 4029 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bcf0fc8f2d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bcf0fc8f2d6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:03.739650774 +0000 UTC m=+8.816527341,LastTimestamp:2026-03-19 11:51:20.805942784 +0000 UTC m=+25.882819351,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:51:21.512402 master-0 kubenswrapper[4029]: I0319 11:51:21.512331 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:51:21.809205 master-0 kubenswrapper[4029]: I0319 11:51:21.809024 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 19 11:51:21.809713 master-0 kubenswrapper[4029]: I0319 11:51:21.809568 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:21.811245 master-0 kubenswrapper[4029]: I0319 11:51:21.811180 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:21.811298 master-0 kubenswrapper[4029]: I0319 11:51:21.811282 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:21.811338 master-0 kubenswrapper[4029]: I0319 11:51:21.811312 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:22.513237 master-0 kubenswrapper[4029]: I0319 11:51:22.512867 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:51:23.482301 master-0 kubenswrapper[4029]: I0319 11:51:23.482199 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:23.483411 master-0 kubenswrapper[4029]: I0319 11:51:23.483384 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:23.483468 master-0 kubenswrapper[4029]: I0319 11:51:23.483422 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:23.483468 master-0 kubenswrapper[4029]: I0319 11:51:23.483435 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:23.483526 master-0 kubenswrapper[4029]: I0319 11:51:23.483495 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:51:23.487721 master-0 kubenswrapper[4029]: E0319 11:51:23.487672 4029 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 19 11:51:23.487839 master-0 kubenswrapper[4029]: E0319 11:51:23.487787 4029 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 19 11:51:23.507050 master-0 kubenswrapper[4029]: I0319 11:51:23.506987 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:51:24.511912 master-0 kubenswrapper[4029]: I0319 11:51:24.511755 4029 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:51:25.103683 master-0 kubenswrapper[4029]: I0319 11:51:25.103620 4029 csr.go:257] certificate signing request csr-bh6f9 is issued
Mar 19 11:51:25.385145 master-0 kubenswrapper[4029]: I0319 11:51:25.385028 4029 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 19 11:51:25.515479 master-0 kubenswrapper[4029]: I0319 11:51:25.515380 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:25.531430 master-0 kubenswrapper[4029]: I0319 11:51:25.531385 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:25.590968 master-0 kubenswrapper[4029]: I0319 11:51:25.590909 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:25.633611 master-0 kubenswrapper[4029]: E0319 11:51:25.633561 4029 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 11:51:25.850760 master-0 kubenswrapper[4029]: I0319 11:51:25.850697 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:25.850760 master-0 kubenswrapper[4029]: E0319 11:51:25.850761 4029 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 11:51:25.870941 master-0 kubenswrapper[4029]: I0319 11:51:25.870874 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:25.887157 master-0 kubenswrapper[4029]: I0319 11:51:25.887097 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:25.944068 master-0 kubenswrapper[4029]: I0319 11:51:25.943981 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:26.105411 master-0 kubenswrapper[4029]: I0319 11:51:26.105229 4029 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 11:43:03 +0000 UTC, rotation deadline is 2026-03-20 05:48:34.525483252 +0000 UTC
Mar 19 11:51:26.105411 master-0 kubenswrapper[4029]: I0319 11:51:26.105312 4029 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h57m8.420179993s for next certificate rotation
Mar 19 11:51:26.221836 master-0 kubenswrapper[4029]: I0319 11:51:26.221782 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:26.221836 master-0 kubenswrapper[4029]: E0319 11:51:26.221821 4029 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 11:51:26.329403 master-0 kubenswrapper[4029]: I0319 11:51:26.329336 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:26.348277 master-0 kubenswrapper[4029]: I0319 11:51:26.348195 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:26.404876 master-0 kubenswrapper[4029]: I0319 11:51:26.404717 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:26.662463 master-0 kubenswrapper[4029]: I0319 11:51:26.662306 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:26.662463 master-0 kubenswrapper[4029]: E0319 11:51:26.662351 4029 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 11:51:26.856840 master-0 kubenswrapper[4029]: I0319 11:51:26.856780 4029 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 11:51:27.245090 master-0 kubenswrapper[4029]: I0319 11:51:27.245015 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:27.259114 master-0 kubenswrapper[4029]: I0319 11:51:27.259020 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:27.314123 master-0 kubenswrapper[4029]: I0319 11:51:27.314062 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:27.572426 master-0 kubenswrapper[4029]: I0319 11:51:27.572242 4029 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:51:27.572426 master-0 kubenswrapper[4029]: E0319 11:51:27.572312 4029 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 11:51:27.662535 master-0 kubenswrapper[4029]: I0319 11:51:27.662435 4029 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 11:51:30.488579 master-0 kubenswrapper[4029]: I0319 11:51:30.488458 4029 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:51:30.490480 master-0 kubenswrapper[4029]: I0319 11:51:30.490357 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:51:30.490480 master-0 kubenswrapper[4029]: I0319 11:51:30.490424 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:51:30.490480 master-0 kubenswrapper[4029]: I0319 11:51:30.490447 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:51:30.491426 master-0 kubenswrapper[4029]: I0319 11:51:30.490536 4029 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:51:30.495928 master-0 kubenswrapper[4029]: E0319 11:51:30.495868 4029 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0"
Mar 19 11:51:30.505387 master-0 kubenswrapper[4029]: I0319 11:51:30.505306 4029 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 19 11:51:30.505387 master-0 kubenswrapper[4029]: E0319 11:51:30.505398 4029 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 19 11:51:30.520550 master-0 kubenswrapper[4029]: E0319 11:51:30.520469 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:30.621821 master-0 kubenswrapper[4029]: E0319 11:51:30.621670 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:30.678803 master-0 kubenswrapper[4029]: I0319 11:51:30.678721 4029 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 19 11:51:30.689420 master-0 kubenswrapper[4029]: I0319 11:51:30.689361 4029 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 19 11:51:30.723006 master-0 kubenswrapper[4029]: E0319 11:51:30.722935 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:30.823839 master-0 kubenswrapper[4029]: E0319 11:51:30.823671 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:30.924028 master-0 kubenswrapper[4029]: E0319 11:51:30.923916 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.025128 master-0 kubenswrapper[4029]: E0319 11:51:31.025040 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.126144 master-0 kubenswrapper[4029]: E0319 11:51:31.125936 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.226840 master-0 kubenswrapper[4029]: E0319 11:51:31.226721 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.327691 master-0 kubenswrapper[4029]: E0319 11:51:31.327622 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.428963 master-0 kubenswrapper[4029]: E0319 11:51:31.428759 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.529250 master-0 kubenswrapper[4029]: E0319 11:51:31.529149 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.629878 master-0 kubenswrapper[4029]: E0319 11:51:31.629808 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.730398 master-0 kubenswrapper[4029]: E0319 11:51:31.730331 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.831346 master-0 kubenswrapper[4029]: E0319 11:51:31.831270 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:31.931872 master-0 kubenswrapper[4029]: E0319 11:51:31.931760 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.033102 master-0 kubenswrapper[4029]: E0319 11:51:32.032868 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.133835 master-0 kubenswrapper[4029]: E0319 11:51:32.133687 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.234977 master-0 kubenswrapper[4029]: E0319 11:51:32.234899 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.335490 master-0 kubenswrapper[4029]: E0319 11:51:32.335332 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.435761 master-0 kubenswrapper[4029]: E0319 11:51:32.435676 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.535907 master-0 kubenswrapper[4029]: E0319 11:51:32.535833 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.636616 master-0 kubenswrapper[4029]: E0319 11:51:32.636403 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.736825 master-0 kubenswrapper[4029]: E0319 11:51:32.736754 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.837141 master-0 kubenswrapper[4029]: E0319 11:51:32.837080 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:32.938056 master-0 kubenswrapper[4029]: E0319 11:51:32.937980 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.039039 master-0 kubenswrapper[4029]: E0319 11:51:33.038899 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.139684 master-0 kubenswrapper[4029]: E0319 11:51:33.139567 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.241000 master-0 kubenswrapper[4029]: E0319 11:51:33.240578 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.341547 master-0 kubenswrapper[4029]: E0319 11:51:33.341438 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.441939 master-0 kubenswrapper[4029]: E0319 11:51:33.441811 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.542801 master-0 kubenswrapper[4029]: E0319 11:51:33.542524 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.643654 master-0 kubenswrapper[4029]: E0319 11:51:33.643503 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.743955 master-0 kubenswrapper[4029]: E0319 11:51:33.743771 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.844973 master-0 kubenswrapper[4029]: E0319 11:51:33.844787 4029 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:51:33.848412 master-0 kubenswrapper[4029]: I0319 11:51:33.848360 4029 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 11:51:34.513900 master-0 kubenswrapper[4029]: I0319 11:51:34.513802 4029 apiserver.go:52] "Watching apiserver"
Mar 19 11:51:34.518395 master-0 kubenswrapper[4029]: I0319 11:51:34.518344 4029 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 11:51:34.518907 master-0 kubenswrapper[4029]: I0319 11:51:34.518819 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-pk574","openshift-network-operator/network-operator-7bd846bfc4-7fz6w"]
Mar 19 11:51:34.519529 master-0 kubenswrapper[4029]: I0319 11:51:34.519469 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:51:34.519665 master-0 kubenswrapper[4029]: I0319 11:51:34.519478 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.521877 master-0 kubenswrapper[4029]: I0319 11:51:34.521836 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 11:51:34.522468 master-0 kubenswrapper[4029]: I0319 11:51:34.522379 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 11:51:34.523431 master-0 kubenswrapper[4029]: I0319 11:51:34.523388 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 11:51:34.524155 master-0 kubenswrapper[4029]: I0319 11:51:34.524100 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 11:51:34.524854 master-0 kubenswrapper[4029]: I0319 11:51:34.524806 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 11:51:34.525637 master-0 kubenswrapper[4029]: I0319 11:51:34.525590 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 11:51:34.611366 master-0 kubenswrapper[4029]: I0319 11:51:34.611286 4029 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 19 11:51:34.662032 master-0 kubenswrapper[4029]: I0319 11:51:34.661963 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-48bcp"]
Mar 19 11:51:34.662377 master-0 kubenswrapper[4029]: I0319 11:51:34.662316 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-48bcp"
Mar 19 11:51:34.664956 master-0 kubenswrapper[4029]: I0319 11:51:34.664891 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Mar 19 11:51:34.665646 master-0 kubenswrapper[4029]: I0319 11:51:34.665627 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt"
Mar 19 11:51:34.665879 master-0 kubenswrapper[4029]: I0319 11:51:34.665833 4029 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Mar 19 11:51:34.665978 master-0 kubenswrapper[4029]: I0319 11:51:34.665887 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt"
Mar 19 11:51:34.688031 master-0 kubenswrapper[4029]: I0319 11:51:34.687925 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.688031 master-0 kubenswrapper[4029]: I0319 11:51:34.688022 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.688413 master-0 kubenswrapper[4029]: I0319 11:51:34.688139 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.688413 master-0 kubenswrapper[4029]: I0319 11:51:34.688234 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.688413 master-0 kubenswrapper[4029]: I0319 11:51:34.688278 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.688413 master-0 kubenswrapper[4029]: I0319 11:51:34.688387 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:51:34.688692 master-0 kubenswrapper[4029]: I0319 11:51:34.688432 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:51:34.688692 master-0 kubenswrapper[4029]: I0319 11:51:34.688456 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pngsr\" (UniqueName: \"kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:51:34.789472 master-0 kubenswrapper[4029]: I0319 11:51:34.789250 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.789472 master-0 kubenswrapper[4029]: I0319 11:51:34.789341 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-ca-bundle\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp"
Mar 19 11:51:34.789472 master-0 kubenswrapper[4029]: I0319 11:51:34.789426 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.790016 master-0 kubenswrapper[4029]: I0319 11:51:34.789517 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-sno-bootstrap-files\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp"
Mar 19 11:51:34.790016 master-0 kubenswrapper[4029]: I0319 11:51:34.789609 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.790016 master-0 kubenswrapper[4029]: I0319 11:51:34.789650 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.790016 master-0 kubenswrapper[4029]: I0319 11:51:34.789697 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-var-run-resolv-conf\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp"
Mar 19 11:51:34.790016 master-0 kubenswrapper[4029]: I0319 11:51:34.789950 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:51:34.790016 master-0 kubenswrapper[4029]: I0319 11:51:34.790029 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: I0319 11:51:34.790107 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: I0319 11:51:34.790129 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pngsr\" (UniqueName: \"kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: I0319 11:51:34.790150 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpc9n\" (UniqueName: \"kubernetes.io/projected/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-kube-api-access-bpc9n\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp"
Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: I0319 11:51:34.790099 4029
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: I0319 11:51:34.790184 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: I0319 11:51:34.790237 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-resolv-conf\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: E0319 11:51:34.790285 4029 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: I0319 11:51:34.790336 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:51:34.790479 master-0 kubenswrapper[4029]: E0319 11:51:34.790422 4029 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:51:35.290394397 +0000 UTC m=+40.367270974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:34.791362 master-0 kubenswrapper[4029]: I0319 11:51:34.790825 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:34.793286 master-0 kubenswrapper[4029]: I0319 11:51:34.793200 4029 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 11:51:34.802140 master-0 kubenswrapper[4029]: I0319 11:51:34.802046 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:51:34.812152 master-0 kubenswrapper[4029]: I0319 11:51:34.812100 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pngsr\" (UniqueName: \"kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:51:34.816099 master-0 kubenswrapper[4029]: I0319 11:51:34.816045 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:34.842583 master-0 kubenswrapper[4029]: I0319 11:51:34.842500 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:51:34.853306 master-0 kubenswrapper[4029]: W0319 11:51:34.853242 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3b0d24_ce5e_49c3_a546_874356f75dc6.slice/crio-a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b WatchSource:0}: Error finding container a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b: Status 404 returned error can't find the container with id a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b Mar 19 11:51:34.890954 master-0 kubenswrapper[4029]: I0319 11:51:34.890858 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpc9n\" (UniqueName: \"kubernetes.io/projected/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-kube-api-access-bpc9n\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.890954 master-0 kubenswrapper[4029]: I0319 11:51:34.890940 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-resolv-conf\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.891140 master-0 kubenswrapper[4029]: I0319 11:51:34.891021 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-resolv-conf\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.891140 master-0 kubenswrapper[4029]: I0319 
11:51:34.891087 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-ca-bundle\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.891140 master-0 kubenswrapper[4029]: I0319 11:51:34.891104 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-ca-bundle\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.891140 master-0 kubenswrapper[4029]: I0319 11:51:34.891128 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-sno-bootstrap-files\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.891301 master-0 kubenswrapper[4029]: I0319 11:51:34.891181 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-sno-bootstrap-files\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.891301 master-0 kubenswrapper[4029]: I0319 11:51:34.891198 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-var-run-resolv-conf\") pod \"assisted-installer-controller-48bcp\" (UID: 
\"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.891301 master-0 kubenswrapper[4029]: I0319 11:51:34.891230 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-var-run-resolv-conf\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.911788 master-0 kubenswrapper[4029]: I0319 11:51:34.911701 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpc9n\" (UniqueName: \"kubernetes.io/projected/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-kube-api-access-bpc9n\") pod \"assisted-installer-controller-48bcp\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:34.990115 master-0 kubenswrapper[4029]: I0319 11:51:34.990004 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:35.000360 master-0 kubenswrapper[4029]: W0319 11:51:35.000307 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13ffb3e_ab50_411c_9208_7ba47e8ebc92.slice/crio-19faef5336e0e62090140de4619f79a9e64f33712b5b8e70590e04d8b85ea93f WatchSource:0}: Error finding container 19faef5336e0e62090140de4619f79a9e64f33712b5b8e70590e04d8b85ea93f: Status 404 returned error can't find the container with id 19faef5336e0e62090140de4619f79a9e64f33712b5b8e70590e04d8b85ea93f Mar 19 11:51:35.293687 master-0 kubenswrapper[4029]: I0319 11:51:35.293557 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:35.293687 master-0 kubenswrapper[4029]: E0319 11:51:35.293699 4029 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:35.294136 master-0 kubenswrapper[4029]: E0319 11:51:35.293784 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:51:36.293767796 +0000 UTC m=+41.370644363 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:35.669765 master-0 kubenswrapper[4029]: I0319 11:51:35.669607 4029 scope.go:117] "RemoveContainer" containerID="d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5" Mar 19 11:51:35.669765 master-0 kubenswrapper[4029]: I0319 11:51:35.669630 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Mar 19 11:51:35.670605 master-0 kubenswrapper[4029]: E0319 11:51:35.669901 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 19 11:51:35.843477 master-0 kubenswrapper[4029]: I0319 11:51:35.843407 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-48bcp" event={"ID":"c13ffb3e-ab50-411c-9208-7ba47e8ebc92","Type":"ContainerStarted","Data":"19faef5336e0e62090140de4619f79a9e64f33712b5b8e70590e04d8b85ea93f"} Mar 19 11:51:35.844232 master-0 kubenswrapper[4029]: I0319 11:51:35.844171 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" event={"ID":"3c3b0d24-ce5e-49c3-a546-874356f75dc6","Type":"ContainerStarted","Data":"a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b"} Mar 19 11:51:35.844527 master-0 kubenswrapper[4029]: I0319 11:51:35.844495 4029 scope.go:117] "RemoveContainer" 
containerID="d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5" Mar 19 11:51:35.844669 master-0 kubenswrapper[4029]: E0319 11:51:35.844638 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 19 11:51:36.089165 master-0 kubenswrapper[4029]: I0319 11:51:36.089109 4029 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 11:51:36.300780 master-0 kubenswrapper[4029]: I0319 11:51:36.300504 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:36.301127 master-0 kubenswrapper[4029]: E0319 11:51:36.300868 4029 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:36.301127 master-0 kubenswrapper[4029]: E0319 11:51:36.300937 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:51:38.300919672 +0000 UTC m=+43.377796239 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:38.316040 master-0 kubenswrapper[4029]: I0319 11:51:38.315954 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:38.316589 master-0 kubenswrapper[4029]: E0319 11:51:38.316108 4029 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:38.316589 master-0 kubenswrapper[4029]: E0319 11:51:38.316168 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:51:42.316153703 +0000 UTC m=+47.393030270 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:40.428532 master-0 kubenswrapper[4029]: I0319 11:51:40.428412 4029 csr.go:261] certificate signing request csr-qvgw2 is approved, waiting to be issued Mar 19 11:51:40.432414 master-0 kubenswrapper[4029]: I0319 11:51:40.432385 4029 csr.go:257] certificate signing request csr-qvgw2 is issued Mar 19 11:51:40.856102 master-0 kubenswrapper[4029]: I0319 11:51:40.856020 4029 generic.go:334] "Generic (PLEG): container finished" podID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerID="d0afa60868b67a2bbb33777d6af8334fc696accf5659fb55479d8c7b865f745d" exitCode=0 Mar 19 11:51:40.856347 master-0 kubenswrapper[4029]: I0319 11:51:40.856119 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-48bcp" event={"ID":"c13ffb3e-ab50-411c-9208-7ba47e8ebc92","Type":"ContainerDied","Data":"d0afa60868b67a2bbb33777d6af8334fc696accf5659fb55479d8c7b865f745d"} Mar 19 11:51:40.857515 master-0 kubenswrapper[4029]: I0319 11:51:40.857464 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" event={"ID":"3c3b0d24-ce5e-49c3-a546-874356f75dc6","Type":"ContainerStarted","Data":"a35a4f30770261f78e16c8cbde80e6ad1d01d59985d717446c5cf700c3ca0a3e"} Mar 19 11:51:41.433849 master-0 kubenswrapper[4029]: I0319 11:51:41.433703 4029 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 11:43:03 +0000 UTC, rotation deadline is 2026-03-20 06:37:09.826901241 +0000 UTC Mar 19 11:51:41.433849 master-0 kubenswrapper[4029]: I0319 11:51:41.433807 4029 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h45m28.393098412s for next 
certificate rotation Mar 19 11:51:41.875986 master-0 kubenswrapper[4029]: I0319 11:51:41.875920 4029 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:41.892015 master-0 kubenswrapper[4029]: I0319 11:51:41.891936 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" podStartSLOduration=4.750761838 podStartE2EDuration="9.891919772s" podCreationTimestamp="2026-03-19 11:51:32 +0000 UTC" firstStartedPulling="2026-03-19 11:51:34.85647975 +0000 UTC m=+39.933356317" lastFinishedPulling="2026-03-19 11:51:39.997637684 +0000 UTC m=+45.074514251" observedRunningTime="2026-03-19 11:51:40.91300531 +0000 UTC m=+45.989881877" watchObservedRunningTime="2026-03-19 11:51:41.891919772 +0000 UTC m=+46.968796339" Mar 19 11:51:42.041506 master-0 kubenswrapper[4029]: I0319 11:51:42.041420 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-sno-bootstrap-files\") pod \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " Mar 19 11:51:42.041506 master-0 kubenswrapper[4029]: I0319 11:51:42.041488 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpc9n\" (UniqueName: \"kubernetes.io/projected/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-kube-api-access-bpc9n\") pod \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " Mar 19 11:51:42.041934 master-0 kubenswrapper[4029]: I0319 11:51:42.041534 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-ca-bundle\") pod \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\" (UID: 
\"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " Mar 19 11:51:42.041934 master-0 kubenswrapper[4029]: I0319 11:51:42.041559 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-var-run-resolv-conf\") pod \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " Mar 19 11:51:42.041934 master-0 kubenswrapper[4029]: I0319 11:51:42.041579 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-resolv-conf\") pod \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\" (UID: \"c13ffb3e-ab50-411c-9208-7ba47e8ebc92\") " Mar 19 11:51:42.041934 master-0 kubenswrapper[4029]: I0319 11:51:42.041433 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "c13ffb3e-ab50-411c-9208-7ba47e8ebc92" (UID: "c13ffb3e-ab50-411c-9208-7ba47e8ebc92"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:51:42.041934 master-0 kubenswrapper[4029]: I0319 11:51:42.041623 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "c13ffb3e-ab50-411c-9208-7ba47e8ebc92" (UID: "c13ffb3e-ab50-411c-9208-7ba47e8ebc92"). InnerVolumeSpecName "host-ca-bundle". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:51:42.041934 master-0 kubenswrapper[4029]: I0319 11:51:42.041749 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "c13ffb3e-ab50-411c-9208-7ba47e8ebc92" (UID: "c13ffb3e-ab50-411c-9208-7ba47e8ebc92"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:51:42.041934 master-0 kubenswrapper[4029]: I0319 11:51:42.041772 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "c13ffb3e-ab50-411c-9208-7ba47e8ebc92" (UID: "c13ffb3e-ab50-411c-9208-7ba47e8ebc92"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:51:42.045360 master-0 kubenswrapper[4029]: I0319 11:51:42.045313 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-kube-api-access-bpc9n" (OuterVolumeSpecName: "kube-api-access-bpc9n") pod "c13ffb3e-ab50-411c-9208-7ba47e8ebc92" (UID: "c13ffb3e-ab50-411c-9208-7ba47e8ebc92"). InnerVolumeSpecName "kube-api-access-bpc9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:51:42.142796 master-0 kubenswrapper[4029]: I0319 11:51:42.142506 4029 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 11:51:42.142796 master-0 kubenswrapper[4029]: I0319 11:51:42.142540 4029 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 19 11:51:42.142796 master-0 kubenswrapper[4029]: I0319 11:51:42.142555 4029 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 19 11:51:42.142796 master-0 kubenswrapper[4029]: I0319 11:51:42.142564 4029 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 19 11:51:42.142796 master-0 kubenswrapper[4029]: I0319 11:51:42.142573 4029 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpc9n\" (UniqueName: \"kubernetes.io/projected/c13ffb3e-ab50-411c-9208-7ba47e8ebc92-kube-api-access-bpc9n\") on node \"master-0\" DevicePath \"\"" Mar 19 11:51:42.344789 master-0 kubenswrapper[4029]: I0319 11:51:42.344627 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:42.344789 master-0 
kubenswrapper[4029]: E0319 11:51:42.344807 4029 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:42.345231 master-0 kubenswrapper[4029]: E0319 11:51:42.344881 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:51:50.344859899 +0000 UTC m=+55.421736466 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:42.434903 master-0 kubenswrapper[4029]: I0319 11:51:42.434845 4029 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 11:43:03 +0000 UTC, rotation deadline is 2026-03-20 08:43:23.507034202 +0000 UTC Mar 19 11:51:42.434903 master-0 kubenswrapper[4029]: I0319 11:51:42.434890 4029 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h51m41.072147123s for next certificate rotation Mar 19 11:51:42.437373 master-0 kubenswrapper[4029]: I0319 11:51:42.437334 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-vjwlk"] Mar 19 11:51:42.437453 master-0 kubenswrapper[4029]: E0319 11:51:42.437410 4029 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller" Mar 19 11:51:42.437453 master-0 kubenswrapper[4029]: I0319 11:51:42.437421 4029 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller" Mar 19 11:51:42.437453 master-0 
kubenswrapper[4029]: I0319 11:51:42.437448 4029 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller" Mar 19 11:51:42.437604 master-0 kubenswrapper[4029]: I0319 11:51:42.437583 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-vjwlk" Mar 19 11:51:42.545694 master-0 kubenswrapper[4029]: I0319 11:51:42.545624 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff9mf\" (UniqueName: \"kubernetes.io/projected/0121ab07-b504-4577-bb1b-fef929268726-kube-api-access-ff9mf\") pod \"mtu-prober-vjwlk\" (UID: \"0121ab07-b504-4577-bb1b-fef929268726\") " pod="openshift-network-operator/mtu-prober-vjwlk" Mar 19 11:51:42.646471 master-0 kubenswrapper[4029]: I0319 11:51:42.646397 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff9mf\" (UniqueName: \"kubernetes.io/projected/0121ab07-b504-4577-bb1b-fef929268726-kube-api-access-ff9mf\") pod \"mtu-prober-vjwlk\" (UID: \"0121ab07-b504-4577-bb1b-fef929268726\") " pod="openshift-network-operator/mtu-prober-vjwlk" Mar 19 11:51:42.661066 master-0 kubenswrapper[4029]: I0319 11:51:42.661008 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff9mf\" (UniqueName: \"kubernetes.io/projected/0121ab07-b504-4577-bb1b-fef929268726-kube-api-access-ff9mf\") pod \"mtu-prober-vjwlk\" (UID: \"0121ab07-b504-4577-bb1b-fef929268726\") " pod="openshift-network-operator/mtu-prober-vjwlk" Mar 19 11:51:42.748007 master-0 kubenswrapper[4029]: I0319 11:51:42.747885 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-vjwlk" Mar 19 11:51:42.759580 master-0 kubenswrapper[4029]: W0319 11:51:42.759527 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0121ab07_b504_4577_bb1b_fef929268726.slice/crio-8fbc1c222e9490f91d6a952eefdb3b6b0e27ef8c0528bce673e1da480e2d8f19 WatchSource:0}: Error finding container 8fbc1c222e9490f91d6a952eefdb3b6b0e27ef8c0528bce673e1da480e2d8f19: Status 404 returned error can't find the container with id 8fbc1c222e9490f91d6a952eefdb3b6b0e27ef8c0528bce673e1da480e2d8f19 Mar 19 11:51:42.865452 master-0 kubenswrapper[4029]: I0319 11:51:42.865378 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-vjwlk" event={"ID":"0121ab07-b504-4577-bb1b-fef929268726","Type":"ContainerStarted","Data":"8fbc1c222e9490f91d6a952eefdb3b6b0e27ef8c0528bce673e1da480e2d8f19"} Mar 19 11:51:42.866592 master-0 kubenswrapper[4029]: I0319 11:51:42.866544 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-48bcp" event={"ID":"c13ffb3e-ab50-411c-9208-7ba47e8ebc92","Type":"ContainerDied","Data":"19faef5336e0e62090140de4619f79a9e64f33712b5b8e70590e04d8b85ea93f"} Mar 19 11:51:42.866592 master-0 kubenswrapper[4029]: I0319 11:51:42.866587 4029 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19faef5336e0e62090140de4619f79a9e64f33712b5b8e70590e04d8b85ea93f" Mar 19 11:51:42.866701 master-0 kubenswrapper[4029]: I0319 11:51:42.866598 4029 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:51:43.871183 master-0 kubenswrapper[4029]: I0319 11:51:43.871110 4029 generic.go:334] "Generic (PLEG): container finished" podID="0121ab07-b504-4577-bb1b-fef929268726" containerID="7dae6204524503aef6defd496cb7b6d0917403d46739b0545f2e50058742fb7c" exitCode=0 Mar 19 11:51:43.871183 master-0 kubenswrapper[4029]: I0319 11:51:43.871163 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-vjwlk" event={"ID":"0121ab07-b504-4577-bb1b-fef929268726","Type":"ContainerDied","Data":"7dae6204524503aef6defd496cb7b6d0917403d46739b0545f2e50058742fb7c"} Mar 19 11:51:44.887774 master-0 kubenswrapper[4029]: I0319 11:51:44.887703 4029 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-vjwlk" Mar 19 11:51:45.064117 master-0 kubenswrapper[4029]: I0319 11:51:45.064027 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff9mf\" (UniqueName: \"kubernetes.io/projected/0121ab07-b504-4577-bb1b-fef929268726-kube-api-access-ff9mf\") pod \"0121ab07-b504-4577-bb1b-fef929268726\" (UID: \"0121ab07-b504-4577-bb1b-fef929268726\") " Mar 19 11:51:45.068025 master-0 kubenswrapper[4029]: I0319 11:51:45.067825 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0121ab07-b504-4577-bb1b-fef929268726-kube-api-access-ff9mf" (OuterVolumeSpecName: "kube-api-access-ff9mf") pod "0121ab07-b504-4577-bb1b-fef929268726" (UID: "0121ab07-b504-4577-bb1b-fef929268726"). InnerVolumeSpecName "kube-api-access-ff9mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:51:45.164623 master-0 kubenswrapper[4029]: I0319 11:51:45.164417 4029 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff9mf\" (UniqueName: \"kubernetes.io/projected/0121ab07-b504-4577-bb1b-fef929268726-kube-api-access-ff9mf\") on node \"master-0\" DevicePath \"\"" Mar 19 11:51:45.679625 master-0 kubenswrapper[4029]: E0319 11:51:45.679562 4029 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0121ab07_b504_4577_bb1b_fef929268726.slice\": RecentStats: unable to find data in memory cache]" Mar 19 11:51:45.877468 master-0 kubenswrapper[4029]: I0319 11:51:45.877381 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-vjwlk" event={"ID":"0121ab07-b504-4577-bb1b-fef929268726","Type":"ContainerDied","Data":"8fbc1c222e9490f91d6a952eefdb3b6b0e27ef8c0528bce673e1da480e2d8f19"} Mar 19 11:51:45.877468 master-0 kubenswrapper[4029]: I0319 11:51:45.877442 4029 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fbc1c222e9490f91d6a952eefdb3b6b0e27ef8c0528bce673e1da480e2d8f19" Mar 19 11:51:45.877941 master-0 kubenswrapper[4029]: I0319 11:51:45.877544 4029 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-vjwlk" Mar 19 11:51:46.654417 master-0 kubenswrapper[4029]: I0319 11:51:46.654327 4029 scope.go:117] "RemoveContainer" containerID="d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5" Mar 19 11:51:46.881493 master-0 kubenswrapper[4029]: I0319 11:51:46.881364 4029 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 11:51:46.881858 master-0 kubenswrapper[4029]: I0319 11:51:46.881830 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"0252eb9b3a6c4d52db4e7759da29168fb6757ff67b4995374ebfa16c86b93541"} Mar 19 11:51:47.288633 master-0 kubenswrapper[4029]: I0319 11:51:47.288556 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=12.288541003 podStartE2EDuration="12.288541003s" podCreationTimestamp="2026-03-19 11:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:51:47.288282756 +0000 UTC m=+52.365159343" watchObservedRunningTime="2026-03-19 11:51:47.288541003 +0000 UTC m=+52.365417570" Mar 19 11:51:47.431160 master-0 kubenswrapper[4029]: I0319 11:51:47.431109 4029 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-vjwlk"] Mar 19 11:51:47.432682 master-0 kubenswrapper[4029]: I0319 11:51:47.432639 4029 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-vjwlk"] Mar 19 11:51:47.656450 master-0 kubenswrapper[4029]: I0319 11:51:47.656302 4029 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0121ab07-b504-4577-bb1b-fef929268726" path="/var/lib/kubelet/pods/0121ab07-b504-4577-bb1b-fef929268726/volumes" Mar 19 11:51:50.411145 master-0 kubenswrapper[4029]: I0319 11:51:50.411046 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:51:50.411866 master-0 kubenswrapper[4029]: E0319 11:51:50.411201 4029 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:50.411866 master-0 kubenswrapper[4029]: E0319 11:51:50.411299 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:06.411278995 +0000 UTC m=+71.488155562 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:51:52.302256 master-0 kubenswrapper[4029]: I0319 11:51:52.302172 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-552pc"] Mar 19 11:51:52.302984 master-0 kubenswrapper[4029]: E0319 11:51:52.302324 4029 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0121ab07-b504-4577-bb1b-fef929268726" containerName="prober" Mar 19 11:51:52.302984 master-0 kubenswrapper[4029]: I0319 11:51:52.302345 4029 state_mem.go:107] "Deleted CPUSet assignment" podUID="0121ab07-b504-4577-bb1b-fef929268726" containerName="prober" Mar 19 11:51:52.302984 master-0 kubenswrapper[4029]: I0319 11:51:52.302384 4029 memory_manager.go:354] "RemoveStaleState removing state" podUID="0121ab07-b504-4577-bb1b-fef929268726" containerName="prober" Mar 19 11:51:52.302984 master-0 kubenswrapper[4029]: I0319 11:51:52.302810 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-552pc" Mar 19 11:51:52.304700 master-0 kubenswrapper[4029]: I0319 11:51:52.304659 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 11:51:52.304912 master-0 kubenswrapper[4029]: I0319 11:51:52.304866 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 11:51:52.305202 master-0 kubenswrapper[4029]: I0319 11:51:52.305182 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 11:51:52.306359 master-0 kubenswrapper[4029]: I0319 11:51:52.306119 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 11:51:52.427612 master-0 kubenswrapper[4029]: I0319 11:51:52.427569 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.427946 master-0 kubenswrapper[4029]: I0319 11:51:52.427928 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428071 master-0 kubenswrapper[4029]: I0319 11:51:52.428057 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428182 master-0 kubenswrapper[4029]: I0319 11:51:52.428166 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428292 master-0 kubenswrapper[4029]: I0319 11:51:52.428277 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428401 master-0 kubenswrapper[4029]: I0319 11:51:52.428386 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428492 master-0 kubenswrapper[4029]: I0319 11:51:52.428478 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428587 master-0 kubenswrapper[4029]: I0319 11:51:52.428573 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4wk\" (UniqueName: \"kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428689 master-0 kubenswrapper[4029]: I0319 11:51:52.428674 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428819 master-0 kubenswrapper[4029]: I0319 11:51:52.428802 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.428924 master-0 kubenswrapper[4029]: I0319 11:51:52.428908 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.429024 master-0 kubenswrapper[4029]: I0319 11:51:52.429008 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.429120 master-0 kubenswrapper[4029]: I0319 11:51:52.429105 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: 
\"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.429230 master-0 kubenswrapper[4029]: I0319 11:51:52.429216 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.429317 master-0 kubenswrapper[4029]: I0319 11:51:52.429304 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.429464 master-0 kubenswrapper[4029]: I0319 11:51:52.429417 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.429523 master-0 kubenswrapper[4029]: I0319 11:51:52.429463 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.503426 master-0 kubenswrapper[4029]: I0319 11:51:52.503389 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-n8vwk"] Mar 19 11:51:52.504071 master-0 kubenswrapper[4029]: I0319 11:51:52.504055 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:51:52.505850 master-0 kubenswrapper[4029]: I0319 11:51:52.505817 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 11:51:52.505912 master-0 kubenswrapper[4029]: I0319 11:51:52.505832 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 11:51:52.529960 master-0 kubenswrapper[4029]: I0319 11:51:52.529912 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.529960 master-0 kubenswrapper[4029]: I0319 11:51:52.529954 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530170 master-0 kubenswrapper[4029]: I0319 11:51:52.529976 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530170 master-0 kubenswrapper[4029]: I0319 11:51:52.529993 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530170 master-0 kubenswrapper[4029]: I0319 11:51:52.530056 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530259 master-0 kubenswrapper[4029]: I0319 11:51:52.530168 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530259 master-0 kubenswrapper[4029]: I0319 11:51:52.530219 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530259 master-0 kubenswrapper[4029]: I0319 11:51:52.530241 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530259 master-0 kubenswrapper[4029]: I0319 11:51:52.530246 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530259 master-0 kubenswrapper[4029]: 
I0319 11:51:52.530257 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530388 master-0 kubenswrapper[4029]: I0319 11:51:52.530303 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530418 master-0 kubenswrapper[4029]: I0319 11:51:52.530389 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530418 master-0 kubenswrapper[4029]: I0319 11:51:52.530409 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530472 master-0 kubenswrapper[4029]: I0319 11:51:52.530417 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530472 master-0 kubenswrapper[4029]: I0319 11:51:52.530451 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530528 master-0 kubenswrapper[4029]: I0319 11:51:52.530484 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530528 master-0 kubenswrapper[4029]: I0319 11:51:52.530484 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530528 master-0 kubenswrapper[4029]: I0319 11:51:52.530503 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530528 master-0 kubenswrapper[4029]: I0319 11:51:52.530522 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530528 master-0 kubenswrapper[4029]: I0319 11:51:52.530530 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy\") pod 
\"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530659 master-0 kubenswrapper[4029]: I0319 11:51:52.530551 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530659 master-0 kubenswrapper[4029]: I0319 11:51:52.530578 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530659 master-0 kubenswrapper[4029]: I0319 11:51:52.530593 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530659 master-0 kubenswrapper[4029]: I0319 11:51:52.530616 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530659 master-0 kubenswrapper[4029]: I0319 11:51:52.530638 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530659 master-0 kubenswrapper[4029]: I0319 11:51:52.530655 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530834 master-0 kubenswrapper[4029]: I0319 11:51:52.530670 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4wk\" (UniqueName: \"kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530834 master-0 kubenswrapper[4029]: I0319 11:51:52.530640 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530834 master-0 kubenswrapper[4029]: I0319 11:51:52.530749 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530962 master-0 kubenswrapper[4029]: I0319 11:51:52.530907 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.530999 master-0 kubenswrapper[4029]: I0319 11:51:52.530937 4029 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.531098 master-0 kubenswrapper[4029]: I0319 11:51:52.531073 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.531215 master-0 kubenswrapper[4029]: I0319 11:51:52.531193 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.545797 master-0 kubenswrapper[4029]: I0319 11:51:52.545516 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4wk\" (UniqueName: \"kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:51:52.619175 master-0 kubenswrapper[4029]: I0319 11:51:52.618954 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-552pc"
Mar 19 11:51:52.634433 master-0 kubenswrapper[4029]: I0319 11:51:52.634144 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.634433 master-0 kubenswrapper[4029]: I0319 11:51:52.634225 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.634433 master-0 kubenswrapper[4029]: I0319 11:51:52.634259 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.634433 master-0 kubenswrapper[4029]: I0319 11:51:52.634324 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.634838 master-0 kubenswrapper[4029]: I0319 11:51:52.634435 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqcvx\" (UniqueName: \"kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.634838 master-0 kubenswrapper[4029]: I0319 11:51:52.634523 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.634838 master-0 kubenswrapper[4029]: I0319 11:51:52.634577 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.634838 master-0 kubenswrapper[4029]: I0319 11:51:52.634623 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.735557 master-0 kubenswrapper[4029]: I0319 11:51:52.735479 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.735557 master-0 kubenswrapper[4029]: I0319 11:51:52.735547 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.735557 master-0 kubenswrapper[4029]: I0319 11:51:52.735577 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.735927 master-0 kubenswrapper[4029]: I0319 11:51:52.735598 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.735927 master-0 kubenswrapper[4029]: I0319 11:51:52.735624 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.735927 master-0 kubenswrapper[4029]: I0319 11:51:52.735646 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.735927 master-0 kubenswrapper[4029]: I0319 11:51:52.735797 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.735927 master-0 kubenswrapper[4029]: I0319 11:51:52.735890 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.736127 master-0 kubenswrapper[4029]: I0319 11:51:52.735940 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqcvx\" (UniqueName: \"kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.736127 master-0 kubenswrapper[4029]: I0319 11:51:52.735948 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.736127 master-0 kubenswrapper[4029]: I0319 11:51:52.736101 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.736127 master-0 kubenswrapper[4029]: I0319 11:51:52.736117 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.737035 master-0 kubenswrapper[4029]: I0319 11:51:52.736966 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.738422 master-0 kubenswrapper[4029]: I0319 11:51:52.738251 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.738422 master-0 kubenswrapper[4029]: I0319 11:51:52.738384 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.758362 master-0 kubenswrapper[4029]: I0319 11:51:52.758306 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqcvx\" (UniqueName: \"kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.814635 master-0 kubenswrapper[4029]: I0319 11:51:52.814584 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:51:52.827016 master-0 kubenswrapper[4029]: W0319 11:51:52.826980 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd8ff97d_047e_4ea7_ba6c_9fbc5da0514a.slice/crio-d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932 WatchSource:0}: Error finding container d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932: Status 404 returned error can't find the container with id d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932
Mar 19 11:51:52.899935 master-0 kubenswrapper[4029]: I0319 11:51:52.899701 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-552pc" event={"ID":"09a22c25-6073-4b1a-a029-928452ef37db","Type":"ContainerStarted","Data":"d016f849de27f64d027bbd73120eb329b0253680086fad1c1a5d1d59daba5c27"}
Mar 19 11:51:52.901062 master-0 kubenswrapper[4029]: I0319 11:51:52.901024 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerStarted","Data":"d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932"}
Mar 19 11:51:53.293178 master-0 kubenswrapper[4029]: I0319 11:51:53.293131 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-f6wv7"]
Mar 19 11:51:53.293942 master-0 kubenswrapper[4029]: I0319 11:51:53.293916 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:53.294163 master-0 kubenswrapper[4029]: E0319 11:51:53.294129 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd"
Mar 19 11:51:53.440393 master-0 kubenswrapper[4029]: I0319 11:51:53.440348 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgs4l\" (UniqueName: \"kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:53.441036 master-0 kubenswrapper[4029]: I0319 11:51:53.440413 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:53.541506 master-0 kubenswrapper[4029]: I0319 11:51:53.541463 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:53.541806 master-0 kubenswrapper[4029]: I0319 11:51:53.541791 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgs4l\" (UniqueName: \"kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:53.541879 master-0 kubenswrapper[4029]: E0319 11:51:53.541571 4029 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:51:53.541984 master-0 kubenswrapper[4029]: E0319 11:51:53.541973 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:51:54.041956163 +0000 UTC m=+59.118832740 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:51:53.563010 master-0 kubenswrapper[4029]: I0319 11:51:53.562864 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgs4l\" (UniqueName: \"kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:54.045377 master-0 kubenswrapper[4029]: I0319 11:51:54.045302 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:54.045610 master-0 kubenswrapper[4029]: E0319 11:51:54.045471 4029 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:51:54.045610 master-0 kubenswrapper[4029]: E0319 11:51:54.045560 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:51:55.045527876 +0000 UTC m=+60.122404443 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:51:54.652069 master-0 kubenswrapper[4029]: I0319 11:51:54.652004 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:54.652607 master-0 kubenswrapper[4029]: E0319 11:51:54.652142 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd"
Mar 19 11:51:55.054352 master-0 kubenswrapper[4029]: I0319 11:51:55.054260 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:55.054594 master-0 kubenswrapper[4029]: E0319 11:51:55.054410 4029 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:51:55.054594 master-0 kubenswrapper[4029]: E0319 11:51:55.054461 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:51:57.054448621 +0000 UTC m=+62.131325188 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:51:56.653423 master-0 kubenswrapper[4029]: I0319 11:51:56.652555 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:56.653423 master-0 kubenswrapper[4029]: E0319 11:51:56.653004 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd"
Mar 19 11:51:56.910998 master-0 kubenswrapper[4029]: I0319 11:51:56.909937 4029 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="de0412fe0521ed4585e79b055942133d1bae28dd08d3cd77acada0e7dc47ebba" exitCode=0
Mar 19 11:51:56.910998 master-0 kubenswrapper[4029]: I0319 11:51:56.909994 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerDied","Data":"de0412fe0521ed4585e79b055942133d1bae28dd08d3cd77acada0e7dc47ebba"}
Mar 19 11:51:57.081721 master-0 kubenswrapper[4029]: I0319 11:51:57.081637 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:57.081973 master-0 kubenswrapper[4029]: E0319 11:51:57.081866 4029 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:51:57.081973 master-0 kubenswrapper[4029]: E0319 11:51:57.081970 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:52:01.081948638 +0000 UTC m=+66.158825225 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:51:58.652560 master-0 kubenswrapper[4029]: I0319 11:51:58.652098 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:51:58.652560 master-0 kubenswrapper[4029]: E0319 11:51:58.652231 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd"
Mar 19 11:52:00.652683 master-0 kubenswrapper[4029]: I0319 11:52:00.652629 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:52:00.654689 master-0 kubenswrapper[4029]: E0319 11:52:00.652788 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd"
Mar 19 11:52:01.287079 master-0 kubenswrapper[4029]: I0319 11:52:01.286940 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:52:01.287331 master-0 kubenswrapper[4029]: E0319 11:52:01.287156 4029 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:52:01.287331 master-0 kubenswrapper[4029]: E0319 11:52:01.287266 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:52:09.287238306 +0000 UTC m=+74.364114863 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 11:52:02.652409 master-0 kubenswrapper[4029]: I0319 11:52:02.652333 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:52:02.653057 master-0 kubenswrapper[4029]: E0319 11:52:02.652486 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd"
Mar 19 11:52:04.652560 master-0 kubenswrapper[4029]: I0319 11:52:04.652451 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:52:04.653321 master-0 kubenswrapper[4029]: E0319 11:52:04.652637 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd"
Mar 19 11:52:04.719539 master-0 kubenswrapper[4029]: I0319 11:52:04.719497 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"]
Mar 19 11:52:04.720276 master-0 kubenswrapper[4029]: I0319 11:52:04.720251 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:04.723669 master-0 kubenswrapper[4029]: I0319 11:52:04.723632 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 11:52:04.724539 master-0 kubenswrapper[4029]: I0319 11:52:04.724497 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 11:52:04.724630 master-0 kubenswrapper[4029]: I0319 11:52:04.724614 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 11:52:04.724757 master-0 kubenswrapper[4029]: I0319 11:52:04.724719 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 11:52:04.726025 master-0 kubenswrapper[4029]: I0319 11:52:04.725984 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 11:52:04.907953 master-0 kubenswrapper[4029]: I0319 11:52:04.907832 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m724d"]
Mar 19 11:52:04.908812 master-0 kubenswrapper[4029]: I0319 11:52:04.908786 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:04.910762 master-0 kubenswrapper[4029]: I0319 11:52:04.910715 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 11:52:04.911598 master-0 kubenswrapper[4029]: I0319 11:52:04.911579 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 11:52:04.918171 master-0 kubenswrapper[4029]: I0319 11:52:04.918125 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:04.918299 master-0 kubenswrapper[4029]: I0319 11:52:04.918200 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72jlb\" (UniqueName: \"kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:04.918299 master-0 kubenswrapper[4029]: I0319 11:52:04.918228 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:04.918299 master-0 kubenswrapper[4029]: I0319 11:52:04.918252 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:05.019405 master-0 kubenswrapper[4029]: I0319 11:52:05.019338 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:05.019405 master-0 kubenswrapper[4029]: I0319 11:52:05.019394 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019452 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-netns\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019470 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zf7\" (UniqueName: \"kubernetes.io/projected/9e5f2654-1acc-4938-8f86-ba5328fccfcc-kube-api-access-x7zf7\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019494 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72jlb\" (UniqueName: \"kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019518 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019534 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019558 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-env-overrides\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019617 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019654 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-node-log\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.019669 master-0 kubenswrapper[4029]: I0319 11:52:05.019668 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-log-socket\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019686 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-bin\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019704 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-config\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019751 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-kubelet\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019765 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-slash\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019779 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-etc-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019795 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovn-node-metrics-cert\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019811 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-systemd\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019824 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-var-lib-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019837 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-ovn\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019854 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-systemd-units\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019869 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-ovn-kubernetes\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019891 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName:
\"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-netd\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.020008 master-0 kubenswrapper[4029]: I0319 11:52:05.019919 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-script-lib\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.020577 master-0 kubenswrapper[4029]: I0319 11:52:05.020551 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:05.022395 master-0 kubenswrapper[4029]: I0319 11:52:05.022344 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:05.033176 master-0 kubenswrapper[4029]: I0319 11:52:05.033130 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:05.035864 master-0 kubenswrapper[4029]: I0319 
11:52:05.035834 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jlb\" (UniqueName: \"kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:05.047666 master-0 kubenswrapper[4029]: I0319 11:52:05.047614 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:05.120722 master-0 kubenswrapper[4029]: I0319 11:52:05.120651 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-env-overrides\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120722 master-0 kubenswrapper[4029]: I0319 11:52:05.120710 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120752 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-node-log\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120777 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-log-socket\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120800 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-bin\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120818 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-config\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120838 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-kubelet\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120855 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-slash\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120874 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-etc-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120892 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovn-node-metrics-cert\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120909 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-systemd-units\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.120987 master-0 kubenswrapper[4029]: I0319 11:52:05.120934 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-systemd\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.120955 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-var-lib-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.121033 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-ovn\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.121060 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-ovn-kubernetes\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.121078 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-netd\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.121098 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-script-lib\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.121119 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.121161 4029 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-x7zf7\" (UniqueName: \"kubernetes.io/projected/9e5f2654-1acc-4938-8f86-ba5328fccfcc-kube-api-access-x7zf7\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.121178 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-netns\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.121253 master-0 kubenswrapper[4029]: I0319 11:52:05.121245 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-netns\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122016 master-0 kubenswrapper[4029]: I0319 11:52:05.121990 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-env-overrides\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122069 master-0 kubenswrapper[4029]: I0319 11:52:05.122031 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122069 master-0 kubenswrapper[4029]: I0319 11:52:05.122052 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-node-log\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122124 master-0 kubenswrapper[4029]: I0319 11:52:05.122074 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-log-socket\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122124 master-0 kubenswrapper[4029]: I0319 11:52:05.122097 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-bin\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122580 master-0 kubenswrapper[4029]: I0319 11:52:05.122554 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-config\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122644 master-0 kubenswrapper[4029]: I0319 11:52:05.122596 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-kubelet\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122644 master-0 kubenswrapper[4029]: I0319 11:52:05.122629 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-slash\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.122716 master-0 kubenswrapper[4029]: I0319 11:52:05.122658 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-etc-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.123120 master-0 kubenswrapper[4029]: I0319 11:52:05.123083 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-ovn-kubernetes\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.123191 master-0 kubenswrapper[4029]: I0319 11:52:05.123142 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-var-lib-openvswitch\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.123191 master-0 kubenswrapper[4029]: I0319 11:52:05.123182 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-systemd\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.123276 master-0 kubenswrapper[4029]: I0319 11:52:05.123195 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-systemd-units\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.123276 master-0 kubenswrapper[4029]: I0319 11:52:05.123270 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-netd\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.123358 master-0 kubenswrapper[4029]: I0319 11:52:05.123310 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.123618 master-0 kubenswrapper[4029]: I0319 11:52:05.123227 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-ovn\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.124076 master-0 kubenswrapper[4029]: I0319 11:52:05.124035 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-script-lib\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.130483 master-0 kubenswrapper[4029]: I0319 11:52:05.130439 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovn-node-metrics-cert\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.143837 master-0 kubenswrapper[4029]: I0319 11:52:05.143774 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zf7\" (UniqueName: \"kubernetes.io/projected/9e5f2654-1acc-4938-8f86-ba5328fccfcc-kube-api-access-x7zf7\") pod \"ovnkube-node-m724d\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:05.223551 master-0 kubenswrapper[4029]: I0319 11:52:05.222989 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:06.430667 master-0 kubenswrapper[4029]: I0319 11:52:06.430576 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:06.431230 master-0 kubenswrapper[4029]: E0319 11:52:06.430779 4029 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:06.431230 master-0 kubenswrapper[4029]: E0319 11:52:06.430868 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:38.430847502 +0000 UTC m=+103.507724069 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:06.652521 master-0 kubenswrapper[4029]: I0319 11:52:06.652464 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:06.652763 master-0 kubenswrapper[4029]: E0319 11:52:06.652591 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:07.889297 master-0 kubenswrapper[4029]: I0319 11:52:07.889235 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-cr8n7"] Mar 19 11:52:07.890611 master-0 kubenswrapper[4029]: I0319 11:52:07.890483 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:07.890611 master-0 kubenswrapper[4029]: E0319 11:52:07.890549 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:08.043809 master-0 kubenswrapper[4029]: I0319 11:52:08.043705 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:08.145054 master-0 kubenswrapper[4029]: I0319 11:52:08.144864 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:08.159842 master-0 kubenswrapper[4029]: E0319 11:52:08.159807 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:52:08.159842 master-0 kubenswrapper[4029]: E0319 11:52:08.159841 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:52:08.160035 master-0 kubenswrapper[4029]: E0319 11:52:08.159856 4029 projected.go:194] Error preparing data for projected volume kube-api-access-wzrh8 for pod openshift-network-diagnostics/network-check-target-cr8n7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:08.160035 master-0 kubenswrapper[4029]: E0319 11:52:08.159917 4029 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8 podName:6230ed8f-4608-4168-8f5a-656f411b0ef7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:08.659900955 +0000 UTC m=+73.736777522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wzrh8" (UniqueName: "kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8") pod "network-check-target-cr8n7" (UID: "6230ed8f-4608-4168-8f5a-656f411b0ef7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:08.652441 master-0 kubenswrapper[4029]: I0319 11:52:08.652380 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:08.652793 master-0 kubenswrapper[4029]: E0319 11:52:08.652569 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:08.749343 master-0 kubenswrapper[4029]: I0319 11:52:08.749173 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:08.749626 master-0 kubenswrapper[4029]: E0319 11:52:08.749382 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:52:08.749626 master-0 kubenswrapper[4029]: E0319 11:52:08.749400 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:52:08.749626 master-0 kubenswrapper[4029]: E0319 11:52:08.749413 4029 projected.go:194] Error preparing data for projected volume kube-api-access-wzrh8 for pod openshift-network-diagnostics/network-check-target-cr8n7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:08.749626 master-0 kubenswrapper[4029]: E0319 11:52:08.749466 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8 podName:6230ed8f-4608-4168-8f5a-656f411b0ef7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:09.749453583 +0000 UTC m=+74.826330150 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wzrh8" (UniqueName: "kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8") pod "network-check-target-cr8n7" (UID: "6230ed8f-4608-4168-8f5a-656f411b0ef7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:09.354918 master-0 kubenswrapper[4029]: I0319 11:52:09.354829 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:09.355562 master-0 kubenswrapper[4029]: E0319 11:52:09.355073 4029 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:09.355562 master-0 kubenswrapper[4029]: E0319 11:52:09.355209 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:52:25.355187886 +0000 UTC m=+90.432064463 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:09.652331 master-0 kubenswrapper[4029]: I0319 11:52:09.652189 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:09.652331 master-0 kubenswrapper[4029]: E0319 11:52:09.652298 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:09.757033 master-0 kubenswrapper[4029]: I0319 11:52:09.756959 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:09.757298 master-0 kubenswrapper[4029]: E0319 11:52:09.757151 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:52:09.757298 master-0 kubenswrapper[4029]: E0319 11:52:09.757201 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:52:09.757298 master-0 kubenswrapper[4029]: E0319 11:52:09.757215 4029 projected.go:194] Error preparing data for projected volume kube-api-access-wzrh8 for pod openshift-network-diagnostics/network-check-target-cr8n7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:09.757298 master-0 kubenswrapper[4029]: E0319 11:52:09.757271 4029 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8 podName:6230ed8f-4608-4168-8f5a-656f411b0ef7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:11.757256921 +0000 UTC m=+76.834133488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wzrh8" (UniqueName: "kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8") pod "network-check-target-cr8n7" (UID: "6230ed8f-4608-4168-8f5a-656f411b0ef7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:10.652749 master-0 kubenswrapper[4029]: I0319 11:52:10.652585 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:10.652749 master-0 kubenswrapper[4029]: E0319 11:52:10.652780 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:11.652633 master-0 kubenswrapper[4029]: I0319 11:52:11.652557 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:11.652975 master-0 kubenswrapper[4029]: E0319 11:52:11.652775 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:11.774706 master-0 kubenswrapper[4029]: I0319 11:52:11.774508 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:11.774976 master-0 kubenswrapper[4029]: E0319 11:52:11.774836 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:52:11.774976 master-0 kubenswrapper[4029]: E0319 11:52:11.774942 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:52:11.774976 master-0 kubenswrapper[4029]: E0319 11:52:11.774966 4029 projected.go:194] Error preparing data for projected volume kube-api-access-wzrh8 for pod openshift-network-diagnostics/network-check-target-cr8n7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:11.775104 master-0 kubenswrapper[4029]: E0319 11:52:11.775058 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8 podName:6230ed8f-4608-4168-8f5a-656f411b0ef7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:15.775032429 +0000 UTC m=+80.851909036 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wzrh8" (UniqueName: "kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8") pod "network-check-target-cr8n7" (UID: "6230ed8f-4608-4168-8f5a-656f411b0ef7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:11.882464 master-0 kubenswrapper[4029]: W0319 11:52:11.882410 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaf4dbb6_5a0a_4c92_a930_479a7330ace1.slice/crio-88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03 WatchSource:0}: Error finding container 88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03: Status 404 returned error can't find the container with id 88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03 Mar 19 11:52:11.885061 master-0 kubenswrapper[4029]: W0319 11:52:11.885034 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e5f2654_1acc_4938_8f86_ba5328fccfcc.slice/crio-ac789a38dd13a595ac15b76c4a526da0f4604507ea8e8d54ee1ca913a0fc96b9 WatchSource:0}: Error finding container ac789a38dd13a595ac15b76c4a526da0f4604507ea8e8d54ee1ca913a0fc96b9: Status 404 returned error can't find the container with id ac789a38dd13a595ac15b76c4a526da0f4604507ea8e8d54ee1ca913a0fc96b9 Mar 19 11:52:11.945969 master-0 kubenswrapper[4029]: I0319 11:52:11.945911 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m724d" event={"ID":"9e5f2654-1acc-4938-8f86-ba5328fccfcc","Type":"ContainerStarted","Data":"ac789a38dd13a595ac15b76c4a526da0f4604507ea8e8d54ee1ca913a0fc96b9"} Mar 19 11:52:11.946940 master-0 kubenswrapper[4029]: I0319 11:52:11.946908 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" event={"ID":"daf4dbb6-5a0a-4c92-a930-479a7330ace1","Type":"ContainerStarted","Data":"88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03"} Mar 19 11:52:11.965866 master-0 kubenswrapper[4029]: I0319 11:52:11.965783 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 11:52:12.292179 master-0 kubenswrapper[4029]: I0319 11:52:12.292038 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-j528w"] Mar 19 11:52:12.298290 master-0 kubenswrapper[4029]: I0319 11:52:12.294476 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.298290 master-0 kubenswrapper[4029]: I0319 11:52:12.295950 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 11:52:12.298290 master-0 kubenswrapper[4029]: I0319 11:52:12.296558 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 11:52:12.298290 master-0 kubenswrapper[4029]: I0319 11:52:12.296919 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 11:52:12.298290 master-0 kubenswrapper[4029]: I0319 11:52:12.296970 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 11:52:12.298290 master-0 kubenswrapper[4029]: I0319 11:52:12.298082 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 11:52:12.306650 master-0 kubenswrapper[4029]: I0319 11:52:12.306556 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=1.306542839 podStartE2EDuration="1.306542839s" podCreationTimestamp="2026-03-19 11:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:52:12.306301894 +0000 UTC m=+77.383178461" watchObservedRunningTime="2026-03-19 11:52:12.306542839 +0000 UTC m=+77.383419406" Mar 19 11:52:12.380754 master-0 kubenswrapper[4029]: I0319 11:52:12.378934 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqr6w\" (UniqueName: \"kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.380754 master-0 kubenswrapper[4029]: I0319 11:52:12.378992 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.380754 master-0 kubenswrapper[4029]: I0319 11:52:12.379021 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.380754 master-0 kubenswrapper[4029]: I0319 11:52:12.379037 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.479679 master-0 kubenswrapper[4029]: I0319 11:52:12.479547 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.479679 master-0 kubenswrapper[4029]: I0319 11:52:12.479609 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.480473 master-0 kubenswrapper[4029]: I0319 11:52:12.479825 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqr6w\" (UniqueName: \"kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.480473 master-0 kubenswrapper[4029]: I0319 11:52:12.479869 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.480473 master-0 kubenswrapper[4029]: E0319 
11:52:12.480169 4029 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found Mar 19 11:52:12.480473 master-0 kubenswrapper[4029]: E0319 11:52:12.480239 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert podName:8438d015-106b-4aed-ae12-dda781ce51fc nodeName:}" failed. No retries permitted until 2026-03-19 11:52:12.980222472 +0000 UTC m=+78.057099039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert") pod "network-node-identity-j528w" (UID: "8438d015-106b-4aed-ae12-dda781ce51fc") : secret "network-node-identity-cert" not found Mar 19 11:52:12.480879 master-0 kubenswrapper[4029]: I0319 11:52:12.480831 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.480938 master-0 kubenswrapper[4029]: I0319 11:52:12.480888 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.501063 master-0 kubenswrapper[4029]: I0319 11:52:12.500978 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqr6w\" (UniqueName: \"kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w\") pod \"network-node-identity-j528w\" (UID: 
\"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.652128 master-0 kubenswrapper[4029]: I0319 11:52:12.651967 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:12.652335 master-0 kubenswrapper[4029]: E0319 11:52:12.652129 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:12.951571 master-0 kubenswrapper[4029]: I0319 11:52:12.951519 4029 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="ac2545e0b2dd4885511fea2e8cd975f1d1867cae6d7a8bfbf5aa8fba195a8d88" exitCode=0 Mar 19 11:52:12.951571 master-0 kubenswrapper[4029]: I0319 11:52:12.951571 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerDied","Data":"ac2545e0b2dd4885511fea2e8cd975f1d1867cae6d7a8bfbf5aa8fba195a8d88"} Mar 19 11:52:12.956056 master-0 kubenswrapper[4029]: I0319 11:52:12.955590 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-552pc" event={"ID":"09a22c25-6073-4b1a-a029-928452ef37db","Type":"ContainerStarted","Data":"3064394565c9496e48894316f1957fce0c589d1189524c5a2447b9260378da26"} Mar 19 11:52:12.958180 master-0 kubenswrapper[4029]: I0319 11:52:12.958144 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" 
event={"ID":"daf4dbb6-5a0a-4c92-a930-479a7330ace1","Type":"ContainerStarted","Data":"bf6ad9dcfe2ae567e8d82f954f212af5ac25add37e64f5e40c91d6128ed379e3"} Mar 19 11:52:12.983873 master-0 kubenswrapper[4029]: I0319 11:52:12.983809 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:12.984595 master-0 kubenswrapper[4029]: E0319 11:52:12.984565 4029 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found Mar 19 11:52:12.984677 master-0 kubenswrapper[4029]: E0319 11:52:12.984613 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert podName:8438d015-106b-4aed-ae12-dda781ce51fc nodeName:}" failed. No retries permitted until 2026-03-19 11:52:13.984600312 +0000 UTC m=+79.061476879 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert") pod "network-node-identity-j528w" (UID: "8438d015-106b-4aed-ae12-dda781ce51fc") : secret "network-node-identity-cert" not found Mar 19 11:52:13.077373 master-0 kubenswrapper[4029]: I0319 11:52:13.076720 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-552pc" podStartSLOduration=1.775006562 podStartE2EDuration="21.07670331s" podCreationTimestamp="2026-03-19 11:51:52 +0000 UTC" firstStartedPulling="2026-03-19 11:51:52.636605752 +0000 UTC m=+57.713482319" lastFinishedPulling="2026-03-19 11:52:11.9383025 +0000 UTC m=+77.015179067" observedRunningTime="2026-03-19 11:52:13.076367223 +0000 UTC m=+78.153243790" watchObservedRunningTime="2026-03-19 11:52:13.07670331 +0000 UTC m=+78.153579877" Mar 19 11:52:13.653471 master-0 kubenswrapper[4029]: I0319 11:52:13.653417 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:13.653684 master-0 kubenswrapper[4029]: E0319 11:52:13.653558 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:13.992606 master-0 kubenswrapper[4029]: I0319 11:52:13.992525 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:13.998442 master-0 kubenswrapper[4029]: I0319 11:52:13.998399 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:14.112901 master-0 kubenswrapper[4029]: I0319 11:52:14.112561 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:14.652456 master-0 kubenswrapper[4029]: I0319 11:52:14.652341 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:14.652833 master-0 kubenswrapper[4029]: E0319 11:52:14.652497 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:14.834865 master-0 kubenswrapper[4029]: W0319 11:52:14.834720 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8438d015_106b_4aed_ae12_dda781ce51fc.slice/crio-23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a WatchSource:0}: Error finding container 23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a: Status 404 returned error can't find the container with id 23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a Mar 19 11:52:14.964161 master-0 kubenswrapper[4029]: I0319 11:52:14.964092 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-j528w" event={"ID":"8438d015-106b-4aed-ae12-dda781ce51fc","Type":"ContainerStarted","Data":"23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a"} Mar 19 11:52:15.653381 master-0 kubenswrapper[4029]: I0319 11:52:15.652469 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:15.653381 master-0 kubenswrapper[4029]: E0319 11:52:15.652912 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:15.809041 master-0 kubenswrapper[4029]: I0319 11:52:15.808980 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:15.809238 master-0 kubenswrapper[4029]: E0319 11:52:15.809117 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:52:15.809238 master-0 kubenswrapper[4029]: E0319 11:52:15.809132 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:52:15.809238 master-0 kubenswrapper[4029]: E0319 11:52:15.809143 4029 projected.go:194] Error preparing data for projected volume kube-api-access-wzrh8 for pod openshift-network-diagnostics/network-check-target-cr8n7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:15.809238 master-0 kubenswrapper[4029]: E0319 11:52:15.809181 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8 podName:6230ed8f-4608-4168-8f5a-656f411b0ef7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:23.809168209 +0000 UTC m=+88.886044776 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wzrh8" (UniqueName: "kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8") pod "network-check-target-cr8n7" (UID: "6230ed8f-4608-4168-8f5a-656f411b0ef7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:15.969671 master-0 kubenswrapper[4029]: I0319 11:52:15.969568 4029 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="081b4d6f699ceead2b8cddd228d7b6dc1383135b83134925db54e215e05a85df" exitCode=0 Mar 19 11:52:15.969671 master-0 kubenswrapper[4029]: I0319 11:52:15.969603 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerDied","Data":"081b4d6f699ceead2b8cddd228d7b6dc1383135b83134925db54e215e05a85df"} Mar 19 11:52:16.651976 master-0 kubenswrapper[4029]: I0319 11:52:16.651905 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:16.652283 master-0 kubenswrapper[4029]: E0319 11:52:16.652100 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:17.651789 master-0 kubenswrapper[4029]: I0319 11:52:17.651738 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:17.652271 master-0 kubenswrapper[4029]: E0319 11:52:17.651861 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:18.651927 master-0 kubenswrapper[4029]: I0319 11:52:18.651873 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:18.653341 master-0 kubenswrapper[4029]: E0319 11:52:18.651973 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:19.652810 master-0 kubenswrapper[4029]: I0319 11:52:19.652637 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:19.653386 master-0 kubenswrapper[4029]: E0319 11:52:19.652816 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:19.983070 master-0 kubenswrapper[4029]: I0319 11:52:19.983013 4029 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="e6b2ecdeb98ba4579257a0e7e4159cee8c04ebbb886d532c90b2d6925d5996ab" exitCode=0 Mar 19 11:52:19.983070 master-0 kubenswrapper[4029]: I0319 11:52:19.983053 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerDied","Data":"e6b2ecdeb98ba4579257a0e7e4159cee8c04ebbb886d532c90b2d6925d5996ab"} Mar 19 11:52:20.652624 master-0 kubenswrapper[4029]: I0319 11:52:20.652548 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:20.652983 master-0 kubenswrapper[4029]: E0319 11:52:20.652788 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:21.652014 master-0 kubenswrapper[4029]: I0319 11:52:21.651946 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:21.652926 master-0 kubenswrapper[4029]: E0319 11:52:21.652836 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:22.652414 master-0 kubenswrapper[4029]: I0319 11:52:22.652349 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:22.653077 master-0 kubenswrapper[4029]: E0319 11:52:22.652496 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:23.652938 master-0 kubenswrapper[4029]: I0319 11:52:23.652202 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:23.652938 master-0 kubenswrapper[4029]: E0319 11:52:23.652382 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:23.823046 master-0 kubenswrapper[4029]: I0319 11:52:23.821986 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:23.823046 master-0 kubenswrapper[4029]: E0319 11:52:23.822291 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:52:23.823046 master-0 kubenswrapper[4029]: E0319 11:52:23.822352 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:52:23.823046 master-0 kubenswrapper[4029]: E0319 11:52:23.822373 4029 projected.go:194] Error preparing data for projected volume kube-api-access-wzrh8 for pod openshift-network-diagnostics/network-check-target-cr8n7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:23.823046 master-0 kubenswrapper[4029]: E0319 11:52:23.822485 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8 podName:6230ed8f-4608-4168-8f5a-656f411b0ef7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:39.822456003 +0000 UTC m=+104.899332720 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wzrh8" (UniqueName: "kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8") pod "network-check-target-cr8n7" (UID: "6230ed8f-4608-4168-8f5a-656f411b0ef7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:24.652437 master-0 kubenswrapper[4029]: I0319 11:52:24.651997 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:24.652437 master-0 kubenswrapper[4029]: E0319 11:52:24.652122 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:25.436755 master-0 kubenswrapper[4029]: I0319 11:52:25.436294 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:25.436755 master-0 kubenswrapper[4029]: E0319 11:52:25.436479 4029 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:25.436755 master-0 kubenswrapper[4029]: E0319 11:52:25.436561 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. 
No retries permitted until 2026-03-19 11:52:57.436539546 +0000 UTC m=+122.513416113 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:25.651964 master-0 kubenswrapper[4029]: I0319 11:52:25.651905 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:25.652670 master-0 kubenswrapper[4029]: E0319 11:52:25.652634 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:26.652667 master-0 kubenswrapper[4029]: I0319 11:52:26.652582 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:26.653320 master-0 kubenswrapper[4029]: E0319 11:52:26.652850 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:27.030187 master-0 kubenswrapper[4029]: W0319 11:52:27.030138 4029 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 19 11:52:27.031912 master-0 kubenswrapper[4029]: I0319 11:52:27.031861 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 11:52:27.661395 master-0 kubenswrapper[4029]: I0319 11:52:27.661345 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:27.662168 master-0 kubenswrapper[4029]: E0319 11:52:27.661607 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:28.652535 master-0 kubenswrapper[4029]: I0319 11:52:28.652468 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:28.652794 master-0 kubenswrapper[4029]: E0319 11:52:28.652634 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:29.652606 master-0 kubenswrapper[4029]: I0319 11:52:29.652551 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:29.653357 master-0 kubenswrapper[4029]: E0319 11:52:29.652667 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:30.325541 master-0 kubenswrapper[4029]: I0319 11:52:30.325490 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 11:52:30.651780 master-0 kubenswrapper[4029]: I0319 11:52:30.651662 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:30.651949 master-0 kubenswrapper[4029]: E0319 11:52:30.651799 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:31.652212 master-0 kubenswrapper[4029]: I0319 11:52:31.652138 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:31.652913 master-0 kubenswrapper[4029]: E0319 11:52:31.652289 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:32.652048 master-0 kubenswrapper[4029]: I0319 11:52:32.651965 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:32.652281 master-0 kubenswrapper[4029]: E0319 11:52:32.652161 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:33.472876 master-0 kubenswrapper[4029]: I0319 11:52:33.471591 4029 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m724d"] Mar 19 11:52:33.652018 master-0 kubenswrapper[4029]: I0319 11:52:33.651928 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:33.652366 master-0 kubenswrapper[4029]: E0319 11:52:33.652228 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:34.652022 master-0 kubenswrapper[4029]: I0319 11:52:34.651882 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:34.652218 master-0 kubenswrapper[4029]: E0319 11:52:34.652195 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:35.155647 master-0 kubenswrapper[4029]: I0319 11:52:35.155568 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerStarted","Data":"a882ec3e14e198707c095bc0bdd34381c81e4c1697293837f13c4fc402ee5b87"} Mar 19 11:52:35.157017 master-0 kubenswrapper[4029]: I0319 11:52:35.156982 4029 generic.go:334] "Generic (PLEG): container finished" podID="9e5f2654-1acc-4938-8f86-ba5328fccfcc" containerID="43296dac019637a7ebee3fc43b3dbab032125407453d3a257a54beb675a6589b" exitCode=0 Mar 19 11:52:35.157157 master-0 kubenswrapper[4029]: I0319 11:52:35.157045 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m724d" event={"ID":"9e5f2654-1acc-4938-8f86-ba5328fccfcc","Type":"ContainerDied","Data":"43296dac019637a7ebee3fc43b3dbab032125407453d3a257a54beb675a6589b"} Mar 19 11:52:35.159566 master-0 kubenswrapper[4029]: I0319 11:52:35.159419 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" event={"ID":"daf4dbb6-5a0a-4c92-a930-479a7330ace1","Type":"ContainerStarted","Data":"1961370e7c6f3b39c50205c4d3f694632a63f87701e4b9c1a6a05e005ec065b1"} Mar 19 11:52:35.162328 master-0 kubenswrapper[4029]: I0319 11:52:35.162288 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-j528w" event={"ID":"8438d015-106b-4aed-ae12-dda781ce51fc","Type":"ContainerStarted","Data":"27aeacdf42166ebdfe7943145673659894eb1a05c94251adf45a06c9d05c04a8"} Mar 19 11:52:35.162328 master-0 kubenswrapper[4029]: I0319 11:52:35.162323 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-j528w" 
event={"ID":"8438d015-106b-4aed-ae12-dda781ce51fc","Type":"ContainerStarted","Data":"e31563606c69089c8d00a2fd1e9f5279c225bb040e3b5e23507eceec4a5f5634"} Mar 19 11:52:35.180710 master-0 kubenswrapper[4029]: I0319 11:52:35.180618 4029 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m724d" Mar 19 11:52:35.228486 master-0 kubenswrapper[4029]: I0319 11:52:35.228423 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-script-lib\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.228486 master-0 kubenswrapper[4029]: I0319 11:52:35.228467 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-systemd-units\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.228486 master-0 kubenswrapper[4029]: I0319 11:52:35.228488 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovn-node-metrics-cert\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.228486 master-0 kubenswrapper[4029]: I0319 11:52:35.228506 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-netns\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228524 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-env-overrides\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228542 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-bin\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228558 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-etc-openvswitch\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228572 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-ovn-kubernetes\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228587 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-node-log\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228602 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-openvswitch\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: 
\"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228621 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-systemd\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228641 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-slash\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228679 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-slash" (OuterVolumeSpecName: "host-slash") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228708 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228740 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228758 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228775 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228782 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228801 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228820 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-node-log" (OuterVolumeSpecName: "node-log") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.229029 master-0 kubenswrapper[4029]: I0319 11:52:35.228892 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-var-lib-openvswitch\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.228710 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.228952 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-netd\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.228998 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zf7\" (UniqueName: \"kubernetes.io/projected/9e5f2654-1acc-4938-8f86-ba5328fccfcc-kube-api-access-x7zf7\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229015 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229039 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-log-socket\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229061 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-ovn\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 
11:52:35.228928 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.228978 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229072 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229109 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229136 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-config\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229197 4029 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-kubelet\") pod \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\" (UID: \"9e5f2654-1acc-4938-8f86-ba5328fccfcc\") " Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229110 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-log-socket" (OuterVolumeSpecName: "log-socket") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229191 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229220 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:52:35.230115 master-0 kubenswrapper[4029]: I0319 11:52:35.229581 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.229874 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230081 4029 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230310 4029 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230323 4029 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-systemd-units\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230336 4029 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-netns\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230346 4029 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-env-overrides\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230355 4029 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-bin\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230364 4029 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230374 4029 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230390 4029 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-node-log\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230399 4029 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230410 4029 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-systemd\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230419 4029 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-slash\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230428 4029 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230437 4029 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230447 4029 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230474 4029 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-log-socket\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.231202 master-0 kubenswrapper[4029]: I0319 11:52:35.230484 4029 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e5f2654-1acc-4938-8f86-ba5328fccfcc-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.234051 master-0 kubenswrapper[4029]: I0319 11:52:35.233994 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5f2654-1acc-4938-8f86-ba5328fccfcc-kube-api-access-x7zf7" (OuterVolumeSpecName: "kube-api-access-x7zf7") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "kube-api-access-x7zf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:52:35.234140 master-0 kubenswrapper[4029]: I0319 11:52:35.234019 4029 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9e5f2654-1acc-4938-8f86-ba5328fccfcc" (UID: "9e5f2654-1acc-4938-8f86-ba5328fccfcc"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:52:35.331286 master-0 kubenswrapper[4029]: I0319 11:52:35.331210 4029 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.331286 master-0 kubenswrapper[4029]: I0319 11:52:35.331271 4029 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e5f2654-1acc-4938-8f86-ba5328fccfcc-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.331286 master-0 kubenswrapper[4029]: I0319 11:52:35.331282 4029 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zf7\" (UniqueName: \"kubernetes.io/projected/9e5f2654-1acc-4938-8f86-ba5328fccfcc-kube-api-access-x7zf7\") on node \"master-0\" DevicePath \"\""
Mar 19 11:52:35.433873 master-0 kubenswrapper[4029]: I0319 11:52:35.433801 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=6.4337848730000005 podStartE2EDuration="6.433784873s" podCreationTimestamp="2026-03-19 11:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:52:35.433617149 +0000 UTC m=+100.510493716" watchObservedRunningTime="2026-03-19 11:52:35.433784873 +0000 UTC m=+100.510661440"
Mar 19 11:52:35.496199 master-0 kubenswrapper[4029]: I0319 11:52:35.495226 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=9.495209329 podStartE2EDuration="9.495209329s" podCreationTimestamp="2026-03-19 11:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:52:35.474026707 +0000 UTC m=+100.550903294" watchObservedRunningTime="2026-03-19 11:52:35.495209329 +0000 UTC m=+100.572085906"
Mar 19 11:52:35.524554 master-0 kubenswrapper[4029]: I0319 11:52:35.523554 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" podStartSLOduration=8.952758752 podStartE2EDuration="31.523476055s" podCreationTimestamp="2026-03-19 11:52:04 +0000 UTC" firstStartedPulling="2026-03-19 11:52:12.158931692 +0000 UTC m=+77.235808259" lastFinishedPulling="2026-03-19 11:52:34.729648995 +0000 UTC m=+99.806525562" observedRunningTime="2026-03-19 11:52:35.522646926 +0000 UTC m=+100.599523493" watchObservedRunningTime="2026-03-19 11:52:35.523476055 +0000 UTC m=+100.600352622"
Mar 19 11:52:35.524554 master-0 kubenswrapper[4029]: I0319 11:52:35.523769 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-j528w" podStartSLOduration=3.666280433 podStartE2EDuration="23.523762812s" podCreationTimestamp="2026-03-19 11:52:12 +0000 UTC" firstStartedPulling="2026-03-19 11:52:14.836318743 +0000 UTC m=+79.913195310" lastFinishedPulling="2026-03-19 11:52:34.693801132 +0000 UTC m=+99.770677689" observedRunningTime="2026-03-19 11:52:35.507091315 +0000 UTC m=+100.583967882" watchObservedRunningTime="2026-03-19 11:52:35.523762812 +0000 UTC m=+100.600639379"
Mar 19 11:52:35.652457 master-0 kubenswrapper[4029]: I0319 11:52:35.652394 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7"
Mar 19 11:52:35.652992 master-0 kubenswrapper[4029]: E0319 11:52:35.652954 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7"
Mar 19 11:52:36.167291 master-0 kubenswrapper[4029]: I0319 11:52:36.166959 4029 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="a882ec3e14e198707c095bc0bdd34381c81e4c1697293837f13c4fc402ee5b87" exitCode=0
Mar 19 11:52:36.167291 master-0 kubenswrapper[4029]: I0319 11:52:36.167015 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerDied","Data":"a882ec3e14e198707c095bc0bdd34381c81e4c1697293837f13c4fc402ee5b87"}
Mar 19 11:52:36.170245 master-0 kubenswrapper[4029]: I0319 11:52:36.169876 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m724d" event={"ID":"9e5f2654-1acc-4938-8f86-ba5328fccfcc","Type":"ContainerDied","Data":"ac789a38dd13a595ac15b76c4a526da0f4604507ea8e8d54ee1ca913a0fc96b9"}
Mar 19 11:52:36.170245 master-0 kubenswrapper[4029]: I0319 11:52:36.169917 4029 scope.go:117] "RemoveContainer" containerID="43296dac019637a7ebee3fc43b3dbab032125407453d3a257a54beb675a6589b"
Mar 19 11:52:36.170245 master-0 kubenswrapper[4029]: I0319 11:52:36.169978 4029 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m724d"
Mar 19 11:52:36.234940 master-0 kubenswrapper[4029]: I0319 11:52:36.234891 4029 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m724d"]
Mar 19 11:52:36.241962 master-0 kubenswrapper[4029]: I0319 11:52:36.237709 4029 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m724d"]
Mar 19 11:52:36.254791 master-0 kubenswrapper[4029]: I0319 11:52:36.249864 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qxkd"]
Mar 19 11:52:36.254791 master-0 kubenswrapper[4029]: E0319 11:52:36.250345 4029 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5f2654-1acc-4938-8f86-ba5328fccfcc" containerName="kubecfg-setup"
Mar 19 11:52:36.254791 master-0 kubenswrapper[4029]: I0319 11:52:36.250360 4029 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5f2654-1acc-4938-8f86-ba5328fccfcc" containerName="kubecfg-setup"
Mar 19 11:52:36.254791 master-0 kubenswrapper[4029]: I0319 11:52:36.250431 4029 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5f2654-1acc-4938-8f86-ba5328fccfcc" containerName="kubecfg-setup"
Mar 19 11:52:36.254791 master-0 kubenswrapper[4029]: I0319 11:52:36.251246 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.255943 master-0 kubenswrapper[4029]: I0319 11:52:36.255686 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 11:52:36.255943 master-0 kubenswrapper[4029]: I0319 11:52:36.255715 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 11:52:36.342764 master-0 kubenswrapper[4029]: I0319 11:52:36.342588 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.342764 master-0 kubenswrapper[4029]: I0319 11:52:36.342640 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.342764 master-0 kubenswrapper[4029]: I0319 11:52:36.342662 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.342764 master-0 kubenswrapper[4029]: I0319 11:52:36.342772 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.342826 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.342851 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.342890 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.342916 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.342991 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343048 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343073 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343094 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343161 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bb2x\" (UniqueName: \"kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343183 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343206 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343222 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343237 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343288 master-0 kubenswrapper[4029]: I0319 11:52:36.343301 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343682 master-0 kubenswrapper[4029]: I0319 11:52:36.343339 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.343682 master-0 kubenswrapper[4029]: I0319 11:52:36.343359 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444131 master-0 kubenswrapper[4029]: I0319 11:52:36.444006 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444131 master-0 kubenswrapper[4029]: I0319 11:52:36.444058 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444131 master-0 kubenswrapper[4029]: I0319 11:52:36.444076 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444131 master-0 kubenswrapper[4029]: I0319 11:52:36.444093 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444492 master-0 kubenswrapper[4029]: I0319 11:52:36.444434 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444528 master-0 kubenswrapper[4029]: I0319 11:52:36.444516 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444560 master-0 kubenswrapper[4029]: I0319 11:52:36.444538 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444594 master-0 kubenswrapper[4029]: I0319 11:52:36.444560 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444622 master-0 kubenswrapper[4029]: I0319 11:52:36.444594 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444622 master-0 kubenswrapper[4029]: I0319 11:52:36.444615 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444678 master-0 kubenswrapper[4029]: I0319 11:52:36.444636 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bb2x\" (UniqueName: \"kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444678 master-0 kubenswrapper[4029]: I0319 11:52:36.444658 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444678 master-0 kubenswrapper[4029]: I0319 11:52:36.444675 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444780 master-0 kubenswrapper[4029]: I0319 11:52:36.444695 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444780 master-0 kubenswrapper[4029]: I0319 11:52:36.444749 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444840 master-0 kubenswrapper[4029]: I0319 11:52:36.444801 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444840 master-0 kubenswrapper[4029]: I0319 11:52:36.444831 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444895 master-0 kubenswrapper[4029]: I0319 11:52:36.444851 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444895 master-0 kubenswrapper[4029]: I0319 11:52:36.444878 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.444948 master-0 kubenswrapper[4029]: I0319 11:52:36.444897 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445003 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445044 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445094 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445143 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445167 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445162 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445199 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445233 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445200 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445258 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445287 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.445351 master-0 kubenswrapper[4029]: I0319 11:52:36.445298 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.446352 master-0 kubenswrapper[4029]: I0319 11:52:36.445751 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.446352 master-0 kubenswrapper[4029]: I0319 11:52:36.445791 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.446352 master-0 kubenswrapper[4029]: I0319 11:52:36.445848 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.446352 master-0 kubenswrapper[4029]: I0319 11:52:36.445981 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.446352 master-0 kubenswrapper[4029]: I0319 11:52:36.446114 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.446352 master-0 kubenswrapper[4029]: I0319 11:52:36.446215 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.454335 master-0 kubenswrapper[4029]: I0319 11:52:36.453274 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.465800 master-0 kubenswrapper[4029]: I0319 11:52:36.465296 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bb2x\" (UniqueName: \"kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.565548 master-0 kubenswrapper[4029]: I0319 11:52:36.565455 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:36.581255 master-0 kubenswrapper[4029]: W0319 11:52:36.581210 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3053504d_0734_4def_b639_0f5cc2178185.slice/crio-2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b WatchSource:0}: Error finding container 2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b: Status 404 returned error can't find the container with id 2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b
Mar 19 11:52:36.652863 master-0 kubenswrapper[4029]: I0319 11:52:36.652107 4029 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:36.652863 master-0 kubenswrapper[4029]: E0319 11:52:36.652248 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:37.175559 master-0 kubenswrapper[4029]: I0319 11:52:37.175484 4029 generic.go:334] "Generic (PLEG): container finished" podID="3053504d-0734-4def-b639-0f5cc2178185" containerID="3e8362d7d083774070cfab73695a0128d3b617dc47c3ad8cda98be3e5d078943" exitCode=0 Mar 19 11:52:37.175559 master-0 kubenswrapper[4029]: I0319 11:52:37.175562 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerDied","Data":"3e8362d7d083774070cfab73695a0128d3b617dc47c3ad8cda98be3e5d078943"} Mar 19 11:52:37.176165 master-0 kubenswrapper[4029]: I0319 11:52:37.175587 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b"} Mar 19 11:52:37.180745 master-0 kubenswrapper[4029]: I0319 11:52:37.180661 4029 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="ec3103cf568fabdd9da2c1fe1b486c6e0c444ae0adfa29f7784e8224f29d03a4" exitCode=0 Mar 19 11:52:37.180745 master-0 kubenswrapper[4029]: I0319 11:52:37.180716 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" 
event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerDied","Data":"ec3103cf568fabdd9da2c1fe1b486c6e0c444ae0adfa29f7784e8224f29d03a4"} Mar 19 11:52:37.654745 master-0 kubenswrapper[4029]: I0319 11:52:37.654023 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:37.655126 master-0 kubenswrapper[4029]: E0319 11:52:37.655084 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:37.660268 master-0 kubenswrapper[4029]: I0319 11:52:37.660239 4029 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e5f2654-1acc-4938-8f86-ba5328fccfcc" path="/var/lib/kubelet/pods/9e5f2654-1acc-4938-8f86-ba5328fccfcc/volumes" Mar 19 11:52:37.699530 master-0 kubenswrapper[4029]: I0319 11:52:37.699478 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 11:52:38.187241 master-0 kubenswrapper[4029]: I0319 11:52:38.187185 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"69acc672b794da7a54ce2775cead7e2b505f34e663bdebb58fe9d0f333bb3a17"} Mar 19 11:52:38.187241 master-0 kubenswrapper[4029]: I0319 11:52:38.187231 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"77b629acfa6568c86fcb7c434b052460e5e82fda8442020ecf38464aea02d394"} Mar 19 11:52:38.187241 master-0 
kubenswrapper[4029]: I0319 11:52:38.187244 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"16e762e62040a8ab32abf68e50548622fba3368bdbd6aaa182d992de7cfb6d2c"} Mar 19 11:52:38.187241 master-0 kubenswrapper[4029]: I0319 11:52:38.187254 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"66eb07e350d592fa1d3841026448cdd7f9668f512175189ce71899749dff08f2"} Mar 19 11:52:38.188952 master-0 kubenswrapper[4029]: I0319 11:52:38.187267 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"3b30bd392089d541d996c773ae7a567d020c4dd6ea9a9bceb548de0f02122e11"} Mar 19 11:52:38.188952 master-0 kubenswrapper[4029]: I0319 11:52:38.187277 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"599660ab2924a6760194b3d8ab83479d1f28259db6b12974470e957a83bf55b2"} Mar 19 11:52:38.189517 master-0 kubenswrapper[4029]: I0319 11:52:38.189408 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" event={"ID":"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a","Type":"ContainerStarted","Data":"cd99fc74a8cb49942d257386135c6df52b6cfdbc964f50ca809882e7d4215d08"} Mar 19 11:52:38.211493 master-0 kubenswrapper[4029]: I0319 11:52:38.211385 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n8vwk" podStartSLOduration=4.366261874 podStartE2EDuration="46.211348888s" podCreationTimestamp="2026-03-19 11:51:52 +0000 UTC" firstStartedPulling="2026-03-19 
11:51:52.829460151 +0000 UTC m=+57.906336718" lastFinishedPulling="2026-03-19 11:52:34.674547165 +0000 UTC m=+99.751423732" observedRunningTime="2026-03-19 11:52:38.210558121 +0000 UTC m=+103.287434708" watchObservedRunningTime="2026-03-19 11:52:38.211348888 +0000 UTC m=+103.288225465" Mar 19 11:52:38.230309 master-0 kubenswrapper[4029]: I0319 11:52:38.230197 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=1.230177256 podStartE2EDuration="1.230177256s" podCreationTimestamp="2026-03-19 11:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:52:38.229880119 +0000 UTC m=+103.306756696" watchObservedRunningTime="2026-03-19 11:52:38.230177256 +0000 UTC m=+103.307053823" Mar 19 11:52:38.468348 master-0 kubenswrapper[4029]: I0319 11:52:38.468164 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:38.468594 master-0 kubenswrapper[4029]: E0319 11:52:38.468453 4029 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:38.468639 master-0 kubenswrapper[4029]: E0319 11:52:38.468594 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:42.46855718 +0000 UTC m=+167.545433917 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:38.652544 master-0 kubenswrapper[4029]: I0319 11:52:38.652439 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:38.653256 master-0 kubenswrapper[4029]: E0319 11:52:38.652708 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:39.652485 master-0 kubenswrapper[4029]: I0319 11:52:39.652394 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:39.653143 master-0 kubenswrapper[4029]: E0319 11:52:39.652608 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:39.882497 master-0 kubenswrapper[4029]: I0319 11:52:39.882400 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:39.882846 master-0 kubenswrapper[4029]: E0319 11:52:39.882793 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:52:39.882884 master-0 kubenswrapper[4029]: E0319 11:52:39.882846 4029 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:52:39.882884 master-0 kubenswrapper[4029]: E0319 11:52:39.882867 4029 projected.go:194] Error preparing data for projected volume kube-api-access-wzrh8 for pod openshift-network-diagnostics/network-check-target-cr8n7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:39.882991 master-0 kubenswrapper[4029]: E0319 11:52:39.882974 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8 podName:6230ed8f-4608-4168-8f5a-656f411b0ef7 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:11.882944547 +0000 UTC m=+136.959821264 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wzrh8" (UniqueName: "kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8") pod "network-check-target-cr8n7" (UID: "6230ed8f-4608-4168-8f5a-656f411b0ef7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:52:40.201309 master-0 kubenswrapper[4029]: I0319 11:52:40.201077 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"70b587dbc5ca47ec716639b420c023f4a654f956f41fe350e77ee16da5de9df3"} Mar 19 11:52:40.652536 master-0 kubenswrapper[4029]: I0319 11:52:40.652376 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:40.653294 master-0 kubenswrapper[4029]: E0319 11:52:40.653203 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:41.652492 master-0 kubenswrapper[4029]: I0319 11:52:41.652425 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:41.652908 master-0 kubenswrapper[4029]: E0319 11:52:41.652651 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:42.652987 master-0 kubenswrapper[4029]: I0319 11:52:42.652289 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:42.652987 master-0 kubenswrapper[4029]: E0319 11:52:42.652683 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:43.211854 master-0 kubenswrapper[4029]: I0319 11:52:43.211784 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" event={"ID":"3053504d-0734-4def-b639-0f5cc2178185","Type":"ContainerStarted","Data":"295cde2cdc2acfd0e25db4552815fcaae5754248671cf1d11ec1c2218efc536e"} Mar 19 11:52:43.212328 master-0 kubenswrapper[4029]: I0319 11:52:43.212179 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:43.212328 master-0 kubenswrapper[4029]: I0319 11:52:43.212206 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:43.236717 master-0 kubenswrapper[4029]: I0319 11:52:43.235673 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:43.236931 master-0 kubenswrapper[4029]: I0319 11:52:43.236879 4029 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" podStartSLOduration=7.236866174 podStartE2EDuration="7.236866174s" 
podCreationTimestamp="2026-03-19 11:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:52:43.236579368 +0000 UTC m=+108.313455935" watchObservedRunningTime="2026-03-19 11:52:43.236866174 +0000 UTC m=+108.313742741" Mar 19 11:52:43.657198 master-0 kubenswrapper[4029]: I0319 11:52:43.655742 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:43.657198 master-0 kubenswrapper[4029]: E0319 11:52:43.655840 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:44.218907 master-0 kubenswrapper[4029]: I0319 11:52:44.218844 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:44.240703 master-0 kubenswrapper[4029]: I0319 11:52:44.240646 4029 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:44.420948 master-0 kubenswrapper[4029]: I0319 11:52:44.420892 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cr8n7"] Mar 19 11:52:44.421137 master-0 kubenswrapper[4029]: I0319 11:52:44.421045 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:44.421255 master-0 kubenswrapper[4029]: E0319 11:52:44.421203 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:44.425252 master-0 kubenswrapper[4029]: I0319 11:52:44.425197 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f6wv7"] Mar 19 11:52:44.425375 master-0 kubenswrapper[4029]: I0319 11:52:44.425298 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:44.425430 master-0 kubenswrapper[4029]: E0319 11:52:44.425407 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:45.651967 master-0 kubenswrapper[4029]: I0319 11:52:45.651882 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:45.652760 master-0 kubenswrapper[4029]: E0319 11:52:45.652605 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:46.652432 master-0 kubenswrapper[4029]: I0319 11:52:46.652145 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:46.653007 master-0 kubenswrapper[4029]: E0319 11:52:46.652524 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:47.652425 master-0 kubenswrapper[4029]: I0319 11:52:47.652346 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:47.653129 master-0 kubenswrapper[4029]: E0319 11:52:47.652494 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:48.652426 master-0 kubenswrapper[4029]: I0319 11:52:48.652319 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:48.652426 master-0 kubenswrapper[4029]: E0319 11:52:48.652462 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6wv7" podUID="f29b11ce-60e0-46b3-8d28-eea3452513cd" Mar 19 11:52:49.652687 master-0 kubenswrapper[4029]: I0319 11:52:49.652601 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:49.653335 master-0 kubenswrapper[4029]: E0319 11:52:49.652809 4029 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr8n7" podUID="6230ed8f-4608-4168-8f5a-656f411b0ef7" Mar 19 11:52:49.779077 master-0 kubenswrapper[4029]: I0319 11:52:49.779003 4029 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Mar 19 11:52:49.779323 master-0 kubenswrapper[4029]: I0319 11:52:49.779210 4029 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 19 11:52:49.815797 master-0 kubenswrapper[4029]: I0319 11:52:49.815606 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"] Mar 19 11:52:49.816414 master-0 kubenswrapper[4029]: I0319 11:52:49.816358 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:49.819517 master-0 kubenswrapper[4029]: I0319 11:52:49.819474 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 11:52:49.823056 master-0 kubenswrapper[4029]: I0319 11:52:49.823001 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 11:52:49.823422 master-0 kubenswrapper[4029]: I0319 11:52:49.823387 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 11:52:49.823678 master-0 kubenswrapper[4029]: I0319 11:52:49.823648 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 11:52:49.828747 master-0 kubenswrapper[4029]: I0319 11:52:49.828629 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"] Mar 19 11:52:49.829388 master-0 kubenswrapper[4029]: I0319 11:52:49.829353 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:49.834605 master-0 kubenswrapper[4029]: I0319 11:52:49.834563 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-965np"] Mar 19 11:52:49.835152 master-0 kubenswrapper[4029]: I0319 11:52:49.835127 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:49.837002 master-0 kubenswrapper[4029]: I0319 11:52:49.836955 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 11:52:49.838102 master-0 kubenswrapper[4029]: I0319 11:52:49.838067 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"] Mar 19 11:52:49.843944 master-0 kubenswrapper[4029]: I0319 11:52:49.843910 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:49.847162 master-0 kubenswrapper[4029]: I0319 11:52:49.846232 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"] Mar 19 11:52:49.847939 master-0 kubenswrapper[4029]: I0319 11:52:49.847693 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:49.848001 master-0 kubenswrapper[4029]: I0319 11:52:49.847946 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 11:52:49.850546 master-0 kubenswrapper[4029]: I0319 11:52:49.850453 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"] Mar 19 11:52:49.869562 master-0 kubenswrapper[4029]: I0319 11:52:49.868798 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:49.870797 master-0 kubenswrapper[4029]: I0319 11:52:49.870002 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng"] Mar 19 11:52:49.870797 master-0 kubenswrapper[4029]: I0319 11:52:49.870176 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng" Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.872907 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.872968 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"] Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.873124 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.873219 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.873932 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875262 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875313 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5rm4\" (UniqueName: \"kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875359 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875411 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875448 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8bmw\" (UniqueName: \"kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875480 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875521 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875578 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkm97\" (UniqueName: \"kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875621 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wls49\" (UniqueName: \"kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875672 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v27lg\" (UniqueName: \"kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875704 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rt57\" (UniqueName: \"kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57\") pod \"csi-snapshot-controller-operator-5f5d689c6b-fx8ng\" (UID: \"2292109e-92a9-4286-858e-dcd2ac083c43\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875748 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875760 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.876091 master-0 kubenswrapper[4029]: I0319 11:52:49.875755 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.875901 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.875904 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876040 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876052 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876083 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876123 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876136 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876152 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvz6\" (UniqueName: \"kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876238 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876324 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876342 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876361 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.876797 master-0 kubenswrapper[4029]: I0319 11:52:49.876449 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 11:52:49.877578 master-0 kubenswrapper[4029]: I0319 11:52:49.877523 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"]
Mar 19 11:52:49.878034 master-0 kubenswrapper[4029]: I0319 11:52:49.878015 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"]
Mar 19 11:52:49.878294 master-0 kubenswrapper[4029]: I0319 11:52:49.878276 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-bftt4"]
Mar 19 11:52:49.878526 master-0 kubenswrapper[4029]: I0319 11:52:49.878503 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"
Mar 19 11:52:49.879539 master-0 kubenswrapper[4029]: I0319 11:52:49.878695 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 11:52:49.879539 master-0 kubenswrapper[4029]: I0319 11:52:49.878720 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"]
Mar 19 11:52:49.879539 master-0 kubenswrapper[4029]: I0319 11:52:49.879176 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:49.879751 master-0 kubenswrapper[4029]: I0319 11:52:49.879644 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"]
Mar 19 11:52:49.880012 master-0 kubenswrapper[4029]: I0319 11:52:49.879983 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:49.880340 master-0 kubenswrapper[4029]: I0319 11:52:49.880307 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:52:49.882420 master-0 kubenswrapper[4029]: I0319 11:52:49.882385 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 11:52:49.887751 master-0 kubenswrapper[4029]: I0319 11:52:49.886968 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 11:52:49.887751 master-0 kubenswrapper[4029]: I0319 11:52:49.887159 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.887751 master-0 kubenswrapper[4029]: I0319 11:52:49.887271 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 11:52:49.889812 master-0 kubenswrapper[4029]: I0319 11:52:49.888350 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 11:52:49.889812 master-0 kubenswrapper[4029]: I0319 11:52:49.888535 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.889812 master-0 kubenswrapper[4029]: I0319 11:52:49.888822 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.889812 master-0 kubenswrapper[4029]: I0319 11:52:49.889536 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.889812 master-0 kubenswrapper[4029]: I0319 11:52:49.889766 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 11:52:49.890142 master-0 kubenswrapper[4029]: I0319 11:52:49.889972 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 11:52:49.890618 master-0 kubenswrapper[4029]: I0319 11:52:49.890561 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 11:52:49.891020 master-0 kubenswrapper[4029]: I0319 11:52:49.890981 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"]
Mar 19 11:52:49.891679 master-0 kubenswrapper[4029]: I0319 11:52:49.891649 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"]
Mar 19 11:52:49.892130 master-0 kubenswrapper[4029]: I0319 11:52:49.892077 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:52:49.892246 master-0 kubenswrapper[4029]: I0319 11:52:49.892212 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"]
Mar 19 11:52:49.892616 master-0 kubenswrapper[4029]: I0319 11:52:49.892591 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"
Mar 19 11:52:49.892782 master-0 kubenswrapper[4029]: I0319 11:52:49.892755 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"
Mar 19 11:52:49.895046 master-0 kubenswrapper[4029]: I0319 11:52:49.894570 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"]
Mar 19 11:52:49.895046 master-0 kubenswrapper[4029]: I0319 11:52:49.894778 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"]
Mar 19 11:52:49.895046 master-0 kubenswrapper[4029]: I0319 11:52:49.895038 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:52:49.895272 master-0 kubenswrapper[4029]: I0319 11:52:49.895223 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.904370 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.904542 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.904757 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.904755 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.904883 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.907955 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908058 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908194 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908201 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908326 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908389 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908424 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908427 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908444 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908452 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908481 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 11:52:49.908532 master-0 kubenswrapper[4029]: I0319 11:52:49.908560 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 11:52:49.910248 master-0 kubenswrapper[4029]: I0319 11:52:49.908760 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.922919 master-0 kubenswrapper[4029]: I0319 11:52:49.922820 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 19 11:52:49.923194 master-0 kubenswrapper[4029]: I0319 11:52:49.923145 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 11:52:49.923476 master-0 kubenswrapper[4029]: I0319 11:52:49.923448 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.923637 master-0 kubenswrapper[4029]: I0319 11:52:49.923605 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 11:52:49.923968 master-0 kubenswrapper[4029]: I0319 11:52:49.923946 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 11:52:49.924159 master-0 kubenswrapper[4029]: I0319 11:52:49.924138 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 11:52:49.924379 master-0 kubenswrapper[4029]: I0319 11:52:49.924351 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.924554 master-0 kubenswrapper[4029]: I0319 11:52:49.924537 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.927800 master-0 kubenswrapper[4029]: I0319 11:52:49.927434 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 11:52:49.931413 master-0 kubenswrapper[4029]: I0319 11:52:49.931358 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 11:52:49.931755 master-0 kubenswrapper[4029]: I0319 11:52:49.931689 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 11:52:49.935104 master-0 kubenswrapper[4029]: I0319 11:52:49.935068 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.936182 master-0 kubenswrapper[4029]: I0319 11:52:49.936101 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"]
Mar 19 11:52:49.936810 master-0 kubenswrapper[4029]: I0319 11:52:49.936783 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:52:49.937231 master-0 kubenswrapper[4029]: I0319 11:52:49.937198 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"]
Mar 19 11:52:49.937755 master-0 kubenswrapper[4029]: I0319 11:52:49.937709 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:52:49.938865 master-0 kubenswrapper[4029]: I0319 11:52:49.938803 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"]
Mar 19 11:52:49.939459 master-0 kubenswrapper[4029]: I0319 11:52:49.939417 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"
Mar 19 11:52:49.940235 master-0 kubenswrapper[4029]: I0319 11:52:49.940197 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.940329 master-0 kubenswrapper[4029]: I0319 11:52:49.940302 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 11:52:49.942110 master-0 kubenswrapper[4029]: I0319 11:52:49.941156 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 19 11:52:49.942110 master-0 kubenswrapper[4029]: I0319 11:52:49.941585 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"]
Mar 19 11:52:49.943008 master-0 kubenswrapper[4029]: I0319 11:52:49.942963 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 11:52:49.943222 master-0 kubenswrapper[4029]: I0319 11:52:49.943180 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:52:49.943452 master-0 kubenswrapper[4029]: I0319 11:52:49.943429 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 11:52:49.943560 master-0 kubenswrapper[4029]: I0319 11:52:49.943513 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 11:52:49.943696 master-0 kubenswrapper[4029]: I0319 11:52:49.943672 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 11:52:49.943696 master-0 kubenswrapper[4029]: I0319 11:52:49.943687 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.943775 master-0 kubenswrapper[4029]: I0319 11:52:49.943748 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.946289 master-0 kubenswrapper[4029]: I0319 11:52:49.945849 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 11:52:49.946289 master-0 kubenswrapper[4029]: I0319 11:52:49.946101 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"]
Mar 19 11:52:49.947067 master-0 kubenswrapper[4029]: I0319 11:52:49.947030 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"]
Mar 19 11:52:49.947918 master-0 kubenswrapper[4029]: I0319 11:52:49.947874 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng"]
Mar 19 11:52:49.948081 master-0 kubenswrapper[4029]: I0319 11:52:49.948049 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 11:52:49.948256 master-0 kubenswrapper[4029]: I0319 11:52:49.948227 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 11:52:49.950661 master-0 kubenswrapper[4029]: I0319 11:52:49.950621 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 11:52:49.951046 master-0 kubenswrapper[4029]: I0319 11:52:49.951012 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 11:52:49.951551 master-0 kubenswrapper[4029]: I0319 11:52:49.951515 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"]
Mar 19 11:52:49.952512 master-0 kubenswrapper[4029]: I0319 11:52:49.952460 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"]
Mar 19 11:52:49.953776 master-0 kubenswrapper[4029]: I0319 11:52:49.953714 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 11:52:49.953878 master-0 kubenswrapper[4029]: I0319 11:52:49.953843 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"]
Mar 19 11:52:49.954864 master-0 kubenswrapper[4029]: I0319 11:52:49.954838 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 11:52:49.954936 master-0 kubenswrapper[4029]: I0319 11:52:49.954877 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"]
Mar 19 11:52:49.955537 master-0 kubenswrapper[4029]: I0319 11:52:49.955497 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-965np"]
Mar 19 11:52:49.957400 master-0 kubenswrapper[4029]: I0319 11:52:49.956669 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-bftt4"]
Mar 19 11:52:49.957549 master-0 kubenswrapper[4029]: I0319 11:52:49.957519 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"]
Mar 19 11:52:49.958545 master-0 kubenswrapper[4029]: I0319 11:52:49.958500 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"]
Mar 19 11:52:49.960275 master-0 kubenswrapper[4029]: I0319 11:52:49.960228 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"]
Mar 19 11:52:49.961608 master-0 kubenswrapper[4029]: I0319 11:52:49.961558 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"]
Mar 19 11:52:49.962819 master-0 kubenswrapper[4029]: I0319 11:52:49.962779 4029 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-n52gc"]
Mar 19 11:52:49.963360 master-0 kubenswrapper[4029]: I0319 11:52:49.963324 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n52gc"
Mar 19 11:52:49.963612 master-0 kubenswrapper[4029]: I0319 11:52:49.963573 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"]
Mar 19 11:52:49.965385 master-0 kubenswrapper[4029]: I0319 11:52:49.964753 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"]
Mar 19 11:52:49.966089 master-0 kubenswrapper[4029]: I0319 11:52:49.966048 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 11:52:49.966389 master-0 kubenswrapper[4029]: I0319 11:52:49.966361 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"]
Mar 19 11:52:49.967356 master-0 kubenswrapper[4029]: I0319 11:52:49.967004 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"]
Mar 19 11:52:49.968465 master-0 kubenswrapper[4029]: I0319 11:52:49.968406 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"]
Mar 19 11:52:49.971427 master-0 kubenswrapper[4029]: I0319 11:52:49.971385 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"]
Mar 19 11:52:49.972427 master-0 kubenswrapper[4029]: I0319 11:52:49.972375 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"]
Mar 19 11:52:49.981623 master-0 kubenswrapper[4029]: I0319 11:52:49.981555 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.981859 master-0 kubenswrapper[4029]: I0319 11:52:49.981655 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"
Mar 19 11:52:49.981859 master-0 kubenswrapper[4029]: I0319 11:52:49.981701 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rm4\" (UniqueName: \"kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"
Mar 19 11:52:49.981859 master-0 kubenswrapper[4029]: I0319 11:52:49.981684 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"]
Mar 19 11:52:49.981859 master-0 kubenswrapper[4029]: I0319 11:52:49.981760 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"
Mar 19 11:52:49.981859 master-0 kubenswrapper[4029]: I0319 11:52:49.981836 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8bmw\" (UniqueName: \"kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"
Mar 19 11:52:49.981859 master-0 kubenswrapper[4029]: I0319 11:52:49.981868 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.982163 master-0 kubenswrapper[4029]: I0319 11:52:49.981906 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.982163 master-0 kubenswrapper[4029]: I0319 11:52:49.981933 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkm97\" (UniqueName: \"kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"
Mar 19 11:52:49.983293 master-0 kubenswrapper[4029]: I0319 11:52:49.983236 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.983129 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wls49\" (UniqueName: \"kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np"
Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.983795 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v27lg\" (UniqueName: \"kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"
Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.983856 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rt57\" (UniqueName: \"kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57\") pod \"csi-snapshot-controller-operator-5f5d689c6b-fx8ng\" (UID: \"2292109e-92a9-4286-858e-dcd2ac083c43\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng"
Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.983900 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"
Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.983999 4029 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.984042 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: E0319 11:52:49.984096 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: E0319 11:52:49.984722 4029 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.984822 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: E0319 11:52:49.985319 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.484180877 +0000 UTC m=+115.561057444 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.985979 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"] Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.985961 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: E0319 11:52:49.986045 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.485996949 +0000 UTC m=+115.562873516 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.985385 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: E0319 11:52:49.986222 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:52:49.987493 master-0 kubenswrapper[4029]: I0319 11:52:49.986459 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:49.997969 master-0 kubenswrapper[4029]: E0319 11:52:49.986633 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.486601583 +0000 UTC m=+115.563478160 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found Mar 19 11:52:49.997969 master-0 kubenswrapper[4029]: I0319 11:52:49.986867 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:49.997969 master-0 kubenswrapper[4029]: I0319 11:52:49.986933 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvz6\" (UniqueName: \"kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:49.997969 master-0 kubenswrapper[4029]: E0319 11:52:49.987023 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:52:49.997969 master-0 kubenswrapper[4029]: E0319 11:52:49.987142 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.487101275 +0000 UTC m=+115.563977842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found Mar 19 11:52:50.003795 master-0 kubenswrapper[4029]: I0319 11:52:50.003447 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:50.003795 master-0 kubenswrapper[4029]: I0319 11:52:50.003466 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:50.007786 master-0 kubenswrapper[4029]: I0319 11:52:50.007369 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkm97\" (UniqueName: \"kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:50.008605 master-0 kubenswrapper[4029]: I0319 11:52:50.008512 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rm4\" (UniqueName: \"kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:50.009287 master-0 kubenswrapper[4029]: I0319 11:52:50.008889 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8bmw\" (UniqueName: \"kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:50.009287 master-0 kubenswrapper[4029]: I0319 11:52:50.009241 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvz6\" (UniqueName: \"kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:50.009643 master-0 kubenswrapper[4029]: I0319 11:52:50.009516 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rt57\" (UniqueName: \"kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57\") pod \"csi-snapshot-controller-operator-5f5d689c6b-fx8ng\" (UID: \"2292109e-92a9-4286-858e-dcd2ac083c43\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng" Mar 19 11:52:50.010165 master-0 kubenswrapper[4029]: I0319 11:52:50.010088 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v27lg\" (UniqueName: \"kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:50.012203 master-0 kubenswrapper[4029]: I0319 11:52:50.012164 4029 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wls49\" (UniqueName: \"kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:50.087235 master-0 kubenswrapper[4029]: I0319 11:52:50.087160 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:50.087361 master-0 kubenswrapper[4029]: I0319 11:52:50.087318 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:50.087432 master-0 kubenswrapper[4029]: I0319 11:52:50.087376 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrgqb\" (UniqueName: \"kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:50.087548 master-0 kubenswrapper[4029]: I0319 11:52:50.087496 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config\") pod 
\"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:50.087548 master-0 kubenswrapper[4029]: I0319 11:52:50.087532 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnl28\" (UniqueName: \"kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.087696 master-0 kubenswrapper[4029]: I0319 11:52:50.087557 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:50.087696 master-0 kubenswrapper[4029]: I0319 11:52:50.087592 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:50.087696 master-0 kubenswrapper[4029]: I0319 11:52:50.087632 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:52:50.087696 master-0 kubenswrapper[4029]: I0319 11:52:50.087661 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:50.087696 master-0 kubenswrapper[4029]: I0319 11:52:50.087685 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:52:50.087696 master-0 kubenswrapper[4029]: I0319 11:52:50.087710 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:50.087965 master-0 kubenswrapper[4029]: I0319 11:52:50.087830 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " 
pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:50.087965 master-0 kubenswrapper[4029]: I0319 11:52:50.087920 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.087965 master-0 kubenswrapper[4029]: I0319 11:52:50.087952 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5fnx\" (UniqueName: \"kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:50.088070 master-0 kubenswrapper[4029]: I0319 11:52:50.088006 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nds54\" (UniqueName: \"kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:50.088070 master-0 kubenswrapper[4029]: I0319 11:52:50.088052 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qrb\" (UniqueName: \"kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " 
pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:50.088133 master-0 kubenswrapper[4029]: I0319 11:52:50.088079 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.088133 master-0 kubenswrapper[4029]: I0319 11:52:50.088104 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:52:50.088133 master-0 kubenswrapper[4029]: I0319 11:52:50.088124 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46m89\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.088247 master-0 kubenswrapper[4029]: I0319 11:52:50.088145 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:50.088247 master-0 kubenswrapper[4029]: I0319 
11:52:50.088194 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:50.088308 master-0 kubenswrapper[4029]: I0319 11:52:50.088252 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:52:50.088343 master-0 kubenswrapper[4029]: I0319 11:52:50.088307 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.088374 master-0 kubenswrapper[4029]: I0319 11:52:50.088350 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:50.088403 master-0 kubenswrapper[4029]: I0319 11:52:50.088383 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nfnb\" (UniqueName: 
\"kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.088474 master-0 kubenswrapper[4029]: I0319 11:52:50.088431 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.088516 master-0 kubenswrapper[4029]: I0319 11:52:50.088480 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.088516 master-0 kubenswrapper[4029]: I0319 11:52:50.088505 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tc5\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:50.088608 master-0 kubenswrapper[4029]: I0319 11:52:50.088526 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlr9q\" (UniqueName: 
\"kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:50.088608 master-0 kubenswrapper[4029]: I0319 11:52:50.088580 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.088608 master-0 kubenswrapper[4029]: I0319 11:52:50.088600 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:50.088687 master-0 kubenswrapper[4029]: I0319 11:52:50.088616 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:52:50.088687 master-0 kubenswrapper[4029]: I0319 11:52:50.088638 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgz7q\" (UniqueName: 
\"kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:50.088687 master-0 kubenswrapper[4029]: I0319 11:52:50.088657 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:50.088687 master-0 kubenswrapper[4029]: I0319 11:52:50.088680 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:50.088825 master-0 kubenswrapper[4029]: I0319 11:52:50.088715 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:50.088825 master-0 kubenswrapper[4029]: I0319 11:52:50.088748 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:50.088825 master-0 kubenswrapper[4029]: I0319 11:52:50.088763 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.088825 master-0 kubenswrapper[4029]: I0319 11:52:50.088780 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.088825 master-0 kubenswrapper[4029]: I0319 11:52:50.088805 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:50.088825 master-0 kubenswrapper[4029]: I0319 11:52:50.088826 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:50.089004 master-0 kubenswrapper[4029]: I0319 
11:52:50.088848 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qql5t\" (UniqueName: \"kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:52:50.089004 master-0 kubenswrapper[4029]: I0319 11:52:50.088879 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.089004 master-0 kubenswrapper[4029]: I0319 11:52:50.088896 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:50.089004 master-0 kubenswrapper[4029]: I0319 11:52:50.088962 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:50.089137 master-0 kubenswrapper[4029]: I0319 11:52:50.089026 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:50.089137 master-0 kubenswrapper[4029]: I0319 11:52:50.089069 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.089137 master-0 kubenswrapper[4029]: I0319 11:52:50.089101 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:50.089137 master-0 kubenswrapper[4029]: I0319 11:52:50.089138 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:50.089287 master-0 kubenswrapper[4029]: I0319 11:52:50.089189 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p4hg\" (UniqueName: \"kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg\") pod 
\"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:50.089287 master-0 kubenswrapper[4029]: I0319 11:52:50.089231 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:50.089287 master-0 kubenswrapper[4029]: I0319 11:52:50.089260 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.089287 master-0 kubenswrapper[4029]: I0319 11:52:50.089288 4029 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:50.132076 master-0 kubenswrapper[4029]: I0319 11:52:50.132019 4029 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.189892 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.189948 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.189966 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.189986 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.190019 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.190040 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.190056 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.190077 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.190095 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p4hg\" (UniqueName: \"kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " 
pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.190113 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.190139 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:50.190132 master-0 kubenswrapper[4029]: I0319 11:52:50.190158 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190175 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190192 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zrgqb\" (UniqueName: \"kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190214 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190233 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnl28\" (UniqueName: \"kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190251 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190268 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190289 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190308 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190325 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190342 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190373 
4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190415 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190432 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5fnx\" (UniqueName: \"kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190459 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nds54\" (UniqueName: \"kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:50.190670 master-0 kubenswrapper[4029]: I0319 11:52:50.190477 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qrb\" (UniqueName: 
\"kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190501 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190517 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190533 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46m89\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190549 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " 
pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190564 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190580 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190597 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190614 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190631 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nfnb\" 
(UniqueName: \"kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190649 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190668 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190686 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tc5\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190705 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlr9q\" (UniqueName: \"kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q\") pod 
\"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190739 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.191077 master-0 kubenswrapper[4029]: I0319 11:52:50.190762 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgz7q\" (UniqueName: \"kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190780 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190797 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 
11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190819 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190838 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190859 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190877 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190895 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190912 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190930 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190948 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:52:50.191560 master-0 kubenswrapper[4029]: I0319 11:52:50.190966 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qql5t\" (UniqueName: \"kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:52:50.192155 master-0 kubenswrapper[4029]: I0319 11:52:50.192130 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"
Mar 19 11:52:50.194082 master-0 kubenswrapper[4029]: I0319 11:52:50.193004 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.194082 master-0 kubenswrapper[4029]: I0319 11:52:50.193744 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:50.194082 master-0 kubenswrapper[4029]: I0319 11:52:50.193887 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:52:50.194285 master-0 kubenswrapper[4029]: I0319 11:52:50.194111 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc"
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: I0319 11:52:50.194707 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.194866 4029 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.194942 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.694919149 +0000 UTC m=+115.771795896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: I0319 11:52:50.195326 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc"
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: I0319 11:52:50.195747 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: I0319 11:52:50.195751 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: I0319 11:52:50.195854 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.195905 4029 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.195986 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.695965804 +0000 UTC m=+115.772842561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: I0319 11:52:50.196286 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.196574 4029 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.196714 4029 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.197013 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.696998268 +0000 UTC m=+115.773875015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: I0319 11:52:50.197141 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.197234 4029 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 11:52:50.199618 master-0 kubenswrapper[4029]: E0319 11:52:50.197312 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.697286464 +0000 UTC m=+115.774163231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.197677 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.197743 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: E0319 11:52:50.197770 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.697750745 +0000 UTC m=+115.774627312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.198275 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.199136 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.198650 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: E0319 11:52:50.200107 4029 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: E0319 11:52:50.200130 4029 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: E0319 11:52:50.200174 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.700155201 +0000 UTC m=+115.777031968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: E0319 11:52:50.200196 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.700183522 +0000 UTC m=+115.777060179 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.200957 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.201022 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.201066 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:52:50.201910 master-0 kubenswrapper[4029]: I0319 11:52:50.201606 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:52:50.202387 master-0 kubenswrapper[4029]: I0319 11:52:50.202162 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:52:50.202387 master-0 kubenswrapper[4029]: I0319 11:52:50.202208 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"
Mar 19 11:52:50.202387 master-0 kubenswrapper[4029]: I0319 11:52:50.202378 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:52:50.202840 master-0 kubenswrapper[4029]: I0319 11:52:50.202435 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.205701 master-0 kubenswrapper[4029]: I0319 11:52:50.205452 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.205701 master-0 kubenswrapper[4029]: I0319 11:52:50.205663 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"
Mar 19 11:52:50.210803 master-0 kubenswrapper[4029]: I0319 11:52:50.209129 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"
Mar 19 11:52:50.217676 master-0 kubenswrapper[4029]: I0319 11:52:50.216301 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qql5t\" (UniqueName: \"kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:52:50.218375 master-0 kubenswrapper[4029]: I0319 11:52:50.217004 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qrb\" (UniqueName: \"kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:52:50.220670 master-0 kubenswrapper[4029]: I0319 11:52:50.220638 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:52:50.246951 master-0 kubenswrapper[4029]: I0319 11:52:50.246398 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:52:50.261499 master-0 kubenswrapper[4029]: I0319 11:52:50.261281 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"
Mar 19 11:52:50.263287 master-0 kubenswrapper[4029]: I0319 11:52:50.263089 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng"
Mar 19 11:52:50.282070 master-0 kubenswrapper[4029]: I0319 11:52:50.276202 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46m89\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:52:50.282070 master-0 kubenswrapper[4029]: I0319 11:52:50.280720 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgz7q\" (UniqueName: \"kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc"
Mar 19 11:52:50.298237 master-0 kubenswrapper[4029]: I0319 11:52:50.298184 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:52:50.315937 master-0 kubenswrapper[4029]: I0319 11:52:50.315852 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p4hg\" (UniqueName: \"kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:52:50.341685 master-0 kubenswrapper[4029]: I0319 11:52:50.341227 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"]
Mar 19 11:52:50.344121 master-0 kubenswrapper[4029]: I0319 11:52:50.341897 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5fnx\" (UniqueName: \"kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"
Mar 19 11:52:50.351744 master-0 kubenswrapper[4029]: I0319 11:52:50.349525 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"
Mar 19 11:52:50.356707 master-0 kubenswrapper[4029]: I0319 11:52:50.356657 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"
Mar 19 11:52:50.383406 master-0 kubenswrapper[4029]: I0319 11:52:50.383004 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nds54\" (UniqueName: \"kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:52:50.393702 master-0 kubenswrapper[4029]: I0319 11:52:50.393644 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnl28\" (UniqueName: \"kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.419745 master-0 kubenswrapper[4029]: I0319 11:52:50.419686 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"
Mar 19 11:52:50.431551 master-0 kubenswrapper[4029]: I0319 11:52:50.431451 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:52:50.441288 master-0 kubenswrapper[4029]: I0319 11:52:50.440666 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrgqb\" (UniqueName: \"kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:52:50.442070 master-0 kubenswrapper[4029]: I0319 11:52:50.442004 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng"]
Mar 19 11:52:50.453225 master-0 kubenswrapper[4029]: W0319 11:52:50.452739 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2292109e_92a9_4286_858e_dcd2ac083c43.slice/crio-7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61 WatchSource:0}: Error finding container 7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61: Status 404 returned error can't find the container with id 7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61
Mar 19 11:52:50.458126 master-0 kubenswrapper[4029]: I0319 11:52:50.458056 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"
Mar 19 11:52:50.460629 master-0 kubenswrapper[4029]: I0319 11:52:50.459984 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:50.466817 master-0 kubenswrapper[4029]: I0319 11:52:50.466291 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n52gc"
Mar 19 11:52:50.470317 master-0 kubenswrapper[4029]: I0319 11:52:50.470276 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"]
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: I0319 11:52:50.496294 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np"
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: I0319 11:52:50.496340 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: I0319 11:52:50.496414 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: I0319 11:52:50.496434 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: E0319 11:52:50.496584 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: E0319 11:52:50.496632 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.496617104 +0000 UTC m=+116.573493671 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: E0319 11:52:50.496673 4029 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: E0319 11:52:50.496698 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.496690256 +0000 UTC m=+116.573566823 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: E0319 11:52:50.496776 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 11:52:50.496790 master-0 kubenswrapper[4029]: E0319 11:52:50.496798 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.496791748 +0000 UTC m=+116.573668315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found
Mar 19 11:52:50.497707 master-0 kubenswrapper[4029]: E0319 11:52:50.496832 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 11:52:50.497707 master-0 kubenswrapper[4029]: E0319 11:52:50.496857 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.496849089 +0000 UTC m=+116.573725656 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found
Mar 19 11:52:50.501625 master-0 kubenswrapper[4029]: I0319 11:52:50.501585 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nfnb\" (UniqueName: \"kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:52:50.521783 master-0 kubenswrapper[4029]: I0319 11:52:50.520298 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tc5\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:52:50.546151 master-0 kubenswrapper[4029]: I0319 11:52:50.545888 4029 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlr9q\" (UniqueName: \"kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 11:52:50.579986 master-0 kubenswrapper[4029]: I0319 11:52:50.579912 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:52:50.619302 master-0 kubenswrapper[4029]: I0319 11:52:50.619261 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"]
Mar 19 11:52:50.660944 master-0 kubenswrapper[4029]: I0319 11:52:50.654364 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:52:50.665852 master-0 kubenswrapper[4029]: I0319 11:52:50.664780 4029 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 11:52:50.665852 master-0 kubenswrapper[4029]: I0319 11:52:50.665284 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:50.673166 master-0 kubenswrapper[4029]: I0319 11:52:50.669996 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:50.693122 master-0 kubenswrapper[4029]: I0319 11:52:50.679903 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: I0319 11:52:50.698905 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"
Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: I0319 11:52:50.699577 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: I0319 11:52:50.699631 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: I0319 11:52:50.699660 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: I0319
11:52:50.699686 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: I0319 11:52:50.699793 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.699911 4029 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.699956 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.699936254 +0000 UTC m=+116.776812821 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.699999 4029 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.700018 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.700012656 +0000 UTC m=+116.776889223 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.700050 4029 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.700066 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.700060747 +0000 UTC m=+116.776937314 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.700098 4029 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.700114 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.700108738 +0000 UTC m=+116.776985305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found Mar 19 11:52:50.700805 master-0 kubenswrapper[4029]: E0319 11:52:50.700148 4029 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:52:50.701518 master-0 kubenswrapper[4029]: E0319 11:52:50.700163 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.700158209 +0000 UTC m=+116.777034776 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found Mar 19 11:52:50.719121 master-0 kubenswrapper[4029]: I0319 11:52:50.712818 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:50.757791 master-0 kubenswrapper[4029]: I0319 11:52:50.751661 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"] Mar 19 11:52:50.757791 master-0 kubenswrapper[4029]: I0319 11:52:50.753406 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"] Mar 19 11:52:50.804296 master-0 kubenswrapper[4029]: I0319 11:52:50.804187 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:50.804391 master-0 kubenswrapper[4029]: I0319 11:52:50.804305 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:50.804881 master-0 kubenswrapper[4029]: E0319 
11:52:50.804853 4029 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:52:50.804939 master-0 kubenswrapper[4029]: E0319 11:52:50.804900 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.804887801 +0000 UTC m=+116.881764368 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found Mar 19 11:52:50.805042 master-0 kubenswrapper[4029]: E0319 11:52:50.804999 4029 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:52:50.805113 master-0 kubenswrapper[4029]: E0319 11:52:50.805096 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:51.805073185 +0000 UTC m=+116.881949902 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found Mar 19 11:52:50.806527 master-0 kubenswrapper[4029]: I0319 11:52:50.806474 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"] Mar 19 11:52:50.982479 master-0 kubenswrapper[4029]: I0319 11:52:50.982418 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"] Mar 19 11:52:50.984447 master-0 kubenswrapper[4029]: I0319 11:52:50.984153 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"] Mar 19 11:52:50.998635 master-0 kubenswrapper[4029]: I0319 11:52:50.998447 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"] Mar 19 11:52:51.003841 master-0 kubenswrapper[4029]: W0319 11:52:51.003803 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fe4839d_cef4_4ec9_b146_2ae9b76d8a76.slice/crio-159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef WatchSource:0}: Error finding container 159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef: Status 404 returned error can't find the container with id 159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef Mar 19 11:52:51.037889 master-0 kubenswrapper[4029]: I0319 11:52:51.037819 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"] Mar 19 11:52:51.041864 master-0 kubenswrapper[4029]: W0319 11:52:51.041803 4029 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3ceeece_bee9_4fcb_8517_95ebce38e223.slice/crio-28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e WatchSource:0}: Error finding container 28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e: Status 404 returned error can't find the container with id 28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e Mar 19 11:52:51.048032 master-0 kubenswrapper[4029]: I0319 11:52:51.047994 4029 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"] Mar 19 11:52:51.054751 master-0 kubenswrapper[4029]: W0319 11:52:51.054685 4029 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f0a5ee_5e7a_4946_bffa_5d98aa5890bf.slice/crio-8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8 WatchSource:0}: Error finding container 8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8: Status 404 returned error can't find the container with id 8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8 Mar 19 11:52:51.247640 master-0 kubenswrapper[4029]: I0319 11:52:51.247543 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" event={"ID":"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf","Type":"ContainerStarted","Data":"8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8"} Mar 19 11:52:51.248709 master-0 kubenswrapper[4029]: I0319 11:52:51.248667 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" event={"ID":"9b61ea14-a7ea-49f3-9df4-5655765ddf7c","Type":"ContainerStarted","Data":"27419838ac6bb228f6151c74e466e550ee30c7ce1c14772f63c150dcd524d6e7"} Mar 19 
11:52:51.249955 master-0 kubenswrapper[4029]: I0319 11:52:51.249781 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n52gc" event={"ID":"4e2c195f-e97d-4cac-81fc-2d5c551d1c30","Type":"ContainerStarted","Data":"25dfae9bb0843173d90c844dacf16818cb3d6d61cb972bb6cd1177b47a320778"} Mar 19 11:52:51.251244 master-0 kubenswrapper[4029]: I0319 11:52:51.251214 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" event={"ID":"66f88242-8b0b-4790-bbb6-445c19b34ee7","Type":"ContainerStarted","Data":"5359d955256489cf75babf6cd7e374f24ca5753414f295ec115bac354fbe37e1"} Mar 19 11:52:51.252410 master-0 kubenswrapper[4029]: I0319 11:52:51.252355 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerStarted","Data":"28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e"} Mar 19 11:52:51.253348 master-0 kubenswrapper[4029]: I0319 11:52:51.253308 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" event={"ID":"6611e325-6152-480c-9c2c-1b503e49ccd2","Type":"ContainerStarted","Data":"d9bf0e017da39714ca0d58a2ba0c46cd89a43ae7f317d13dbb6e31831feeb576"} Mar 19 11:52:51.254168 master-0 kubenswrapper[4029]: I0319 11:52:51.254131 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" event={"ID":"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d","Type":"ContainerStarted","Data":"857137fd3aca8af8c5c19bcaeff329a322e9a54b7ff7f19d360c176d0e68cab5"} Mar 19 11:52:51.255259 master-0 kubenswrapper[4029]: I0319 11:52:51.255228 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" 
event={"ID":"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76","Type":"ContainerStarted","Data":"159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef"} Mar 19 11:52:51.256583 master-0 kubenswrapper[4029]: I0319 11:52:51.256538 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" event={"ID":"39d3ac31-9259-454b-8e1c-e23024f8f2b2","Type":"ContainerStarted","Data":"5e2f36e1befc8e73ca3645b7b8f74e7be8e2177e72629e38b72062f0d512ab82"} Mar 19 11:52:51.256649 master-0 kubenswrapper[4029]: I0319 11:52:51.256592 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" event={"ID":"39d3ac31-9259-454b-8e1c-e23024f8f2b2","Type":"ContainerStarted","Data":"67cce31157aba8cd32c19fdb97a814cf6764c07048a060294135b6ce20e85f0e"} Mar 19 11:52:51.258523 master-0 kubenswrapper[4029]: I0319 11:52:51.258471 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" event={"ID":"dbcbba74-ac53-4724-a217-4d9b85e7c1db","Type":"ContainerStarted","Data":"74d7d2df3602ec247c94c7641e1ca1523b5ae6b42624ca797fbd2b6225dfbfa4"} Mar 19 11:52:51.259620 master-0 kubenswrapper[4029]: I0319 11:52:51.259469 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng" event={"ID":"2292109e-92a9-4286-858e-dcd2ac083c43","Type":"ContainerStarted","Data":"7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61"} Mar 19 11:52:51.266563 master-0 kubenswrapper[4029]: I0319 11:52:51.266515 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" event={"ID":"f5d73fef-1414-4b29-97ea-42e1c0b1ef18","Type":"ContainerStarted","Data":"ee84c91e209b8d15a57102d97b9b923b7a0a0247657f697f48616f38ce178b0b"} Mar 19 
11:52:51.267644 master-0 kubenswrapper[4029]: I0319 11:52:51.267619 4029 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" event={"ID":"732989c5-1b89-46f0-9917-b68613f7f005","Type":"ContainerStarted","Data":"b54d0875a5c74a95cdb12684066d437927027dd749aa30fdd27e9a88de808b47"} Mar 19 11:52:51.512940 master-0 kubenswrapper[4029]: I0319 11:52:51.512883 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:51.512940 master-0 kubenswrapper[4029]: I0319 11:52:51.512940 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:51.513226 master-0 kubenswrapper[4029]: E0319 11:52:51.513125 4029 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:51.513226 master-0 kubenswrapper[4029]: E0319 11:52:51.513215 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.513194006 +0000 UTC m=+118.590070743 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found Mar 19 11:52:51.513323 master-0 kubenswrapper[4029]: I0319 11:52:51.513294 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:51.513375 master-0 kubenswrapper[4029]: I0319 11:52:51.513353 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:51.513631 master-0 kubenswrapper[4029]: E0319 11:52:51.513486 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:52:51.513631 master-0 kubenswrapper[4029]: E0319 11:52:51.513526 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.513515823 +0000 UTC m=+118.590392590 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found Mar 19 11:52:51.513631 master-0 kubenswrapper[4029]: E0319 11:52:51.513576 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:52:51.513631 master-0 kubenswrapper[4029]: E0319 11:52:51.513580 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:52:51.513631 master-0 kubenswrapper[4029]: E0319 11:52:51.513603 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.513593935 +0000 UTC m=+118.590470742 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found Mar 19 11:52:51.513891 master-0 kubenswrapper[4029]: E0319 11:52:51.513642 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.513622646 +0000 UTC m=+118.590499423 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found Mar 19 11:52:51.652191 master-0 kubenswrapper[4029]: I0319 11:52:51.652143 4029 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:51.653826 master-0 kubenswrapper[4029]: I0319 11:52:51.653795 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 11:52:51.654113 master-0 kubenswrapper[4029]: I0319 11:52:51.654095 4029 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 11:52:51.715339 master-0 kubenswrapper[4029]: I0319 11:52:51.715279 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:51.715339 master-0 kubenswrapper[4029]: I0319 11:52:51.715339 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: I0319 11:52:51.715369 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: I0319 11:52:51.715387 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: I0319 11:52:51.715404 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715541 4029 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715583 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.715570704 +0000 UTC m=+118.792447271 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715846 4029 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715880 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.715870632 +0000 UTC m=+118.792747199 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715924 4029 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715945 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.715938103 +0000 UTC m=+118.792814680 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715973 4029 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715989 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.715984144 +0000 UTC m=+118.792860711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.715926 4029 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:51.716372 master-0 kubenswrapper[4029]: E0319 11:52:51.716010 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.716004865 +0000 UTC m=+118.792881432 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:51.816879 master-0 kubenswrapper[4029]: I0319 11:52:51.816754 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:51.816879 master-0 kubenswrapper[4029]: I0319 11:52:51.816818 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:51.817105 master-0 kubenswrapper[4029]: E0319 11:52:51.817015 4029 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:52:51.817105 master-0 kubenswrapper[4029]: E0319 11:52:51.817072 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.817057361 +0000 UTC m=+118.893933928 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found Mar 19 11:52:51.817388 master-0 kubenswrapper[4029]: E0319 11:52:51.817349 4029 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:52:51.817443 master-0 kubenswrapper[4029]: E0319 11:52:51.817391 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:53.817380518 +0000 UTC m=+118.894257285 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: I0319 11:52:53.545055 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: I0319 11:52:53.545432 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: 
\"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: I0319 11:52:53.545459 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: E0319 11:52:53.545324 4029 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: E0319 11:52:53.545598 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.545575091 +0000 UTC m=+122.622451658 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: E0319 11:52:53.545604 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: E0319 11:52:53.545634 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: E0319 11:52:53.545641 4029 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: E0319 11:52:53.545644 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.545633842 +0000 UTC m=+122.622510409 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: I0319 11:52:53.545513 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: E0319 11:52:53.545702 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.545685514 +0000 UTC m=+122.622562081 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found Mar 19 11:52:53.545752 master-0 kubenswrapper[4029]: E0319 11:52:53.545713 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.545707574 +0000 UTC m=+122.622584141 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found Mar 19 11:52:53.748791 master-0 kubenswrapper[4029]: I0319 11:52:53.747930 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:53.748791 master-0 kubenswrapper[4029]: I0319 11:52:53.748038 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:53.748791 master-0 kubenswrapper[4029]: I0319 11:52:53.748074 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:53.748791 master-0 kubenswrapper[4029]: I0319 11:52:53.748110 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod 
\"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:53.748791 master-0 kubenswrapper[4029]: I0319 11:52:53.748236 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:53.748791 master-0 kubenswrapper[4029]: E0319 11:52:53.748412 4029 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:52:53.748791 master-0 kubenswrapper[4029]: E0319 11:52:53.748510 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.748482262 +0000 UTC m=+122.825358879 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found Mar 19 11:52:53.749366 master-0 kubenswrapper[4029]: E0319 11:52:53.749224 4029 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:53.749366 master-0 kubenswrapper[4029]: E0319 11:52:53.749341 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.749318612 +0000 UTC m=+122.826195219 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:53.749366 master-0 kubenswrapper[4029]: E0319 11:52:53.749426 4029 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:53.749366 master-0 kubenswrapper[4029]: E0319 11:52:53.749470 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.749454715 +0000 UTC m=+122.826331322 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:53.749944 master-0 kubenswrapper[4029]: E0319 11:52:53.749551 4029 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:53.749944 master-0 kubenswrapper[4029]: E0319 11:52:53.749597 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.749581168 +0000 UTC m=+122.826457775 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found Mar 19 11:52:53.749944 master-0 kubenswrapper[4029]: E0319 11:52:53.749681 4029 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:52:53.749944 master-0 kubenswrapper[4029]: E0319 11:52:53.749767 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.749715301 +0000 UTC m=+122.826591908 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found Mar 19 11:52:53.848986 master-0 kubenswrapper[4029]: I0319 11:52:53.848830 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:53.848986 master-0 kubenswrapper[4029]: I0319 11:52:53.848907 4029 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:53.849229 master-0 kubenswrapper[4029]: E0319 11:52:53.849134 4029 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:52:53.849229 master-0 kubenswrapper[4029]: E0319 11:52:53.849189 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.84917249 +0000 UTC m=+122.926049057 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found Mar 19 11:52:53.849320 master-0 kubenswrapper[4029]: E0319 11:52:53.849245 4029 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:52:53.849320 master-0 kubenswrapper[4029]: E0319 11:52:53.849272 4029 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.849263132 +0000 UTC m=+122.926139699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found Mar 19 11:52:55.495220 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 19 11:52:55.561316 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 19 11:52:55.561572 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 19 11:52:55.562119 master-0 systemd[1]: kubelet.service: Consumed 8.060s CPU time. Mar 19 11:52:55.575211 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 19 11:52:55.749784 master-0 kubenswrapper[6932]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 11:52:55.749784 master-0 kubenswrapper[6932]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 11:52:55.749784 master-0 kubenswrapper[6932]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 11:52:55.749784 master-0 kubenswrapper[6932]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 11:52:55.749784 master-0 kubenswrapper[6932]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 11:52:55.749784 master-0 kubenswrapper[6932]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 11:52:55.750794 master-0 kubenswrapper[6932]: I0319 11:52:55.749935 6932 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 11:52:55.753210 master-0 kubenswrapper[6932]: W0319 11:52:55.753172 6932 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 11:52:55.753210 master-0 kubenswrapper[6932]: W0319 11:52:55.753200 6932 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:52:55.753210 master-0 kubenswrapper[6932]: W0319 11:52:55.753209 6932 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753218 6932 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753227 6932 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753234 6932 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753240 6932 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753247 6932 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753252 6932 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753258 6932 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753263 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 
11:52:55.753269 6932 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753281 6932 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753286 6932 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753292 6932 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753298 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753303 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753308 6932 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753314 6932 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753320 6932 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753326 6932 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753332 6932 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 11:52:55.753316 master-0 kubenswrapper[6932]: W0319 11:52:55.753338 6932 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753344 6932 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753351 6932 feature_gate.go:330] unrecognized feature gate: 
InsightsConfig Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753356 6932 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753366 6932 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753373 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753379 6932 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753385 6932 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753390 6932 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753396 6932 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753401 6932 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753406 6932 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753411 6932 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753417 6932 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753424 6932 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753429 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 
11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753434 6932 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753441 6932 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753448 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753453 6932 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:52:55.753818 master-0 kubenswrapper[6932]: W0319 11:52:55.753458 6932 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753466 6932 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753474 6932 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753479 6932 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753485 6932 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753490 6932 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753495 6932 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753501 6932 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753510 6932 
feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753515 6932 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753520 6932 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753525 6932 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753531 6932 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753536 6932 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753541 6932 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753546 6932 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753552 6932 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753557 6932 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753562 6932 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753567 6932 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:52:55.754351 master-0 kubenswrapper[6932]: W0319 11:52:55.753574 6932 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753579 6932 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753584 6932 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753589 6932 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753594 6932 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753599 6932 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753605 6932 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753610 6932 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753615 6932 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: W0319 11:52:55.753620 6932 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753758 6932 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753771 6932 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753782 6932 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753790 6932 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753798 6932 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753806 6932 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753814 6932 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753822 6932 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753829 6932 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753865 6932 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753873 6932 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 11:52:55.754818 master-0 kubenswrapper[6932]: I0319 11:52:55.753880 6932 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753890 6932 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753896 6932 flags.go:64] FLAG: --cgroup-root=""
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753902 6932 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753908 6932 flags.go:64] FLAG: --client-ca-file=""
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753914 6932 flags.go:64] FLAG: --cloud-config=""
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753920 6932 flags.go:64] FLAG: --cloud-provider=""
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753926 6932 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753934 6932 flags.go:64] FLAG: --cluster-domain=""
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753940 6932 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753947 6932 flags.go:64] FLAG: --config-dir=""
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753953 6932 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753959 6932 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753970 6932 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753976 6932 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753983 6932 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753990 6932 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.753997 6932 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754003 6932 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754009 6932 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754016 6932 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754023 6932 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754031 6932 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754038 6932 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754045 6932 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754051 6932 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 11:52:55.755288 master-0 kubenswrapper[6932]: I0319 11:52:55.754058 6932 flags.go:64] FLAG: --enable-server="true"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754065 6932 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754074 6932 flags.go:64] FLAG: --event-burst="100"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754082 6932 flags.go:64] FLAG: --event-qps="50"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754089 6932 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754095 6932 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754101 6932 flags.go:64] FLAG: --eviction-hard=""
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754109 6932 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754115 6932 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754121 6932 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754127 6932 flags.go:64] FLAG: --eviction-soft=""
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754133 6932 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754139 6932 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754145 6932 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754151 6932 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754157 6932 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754163 6932 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754168 6932 flags.go:64] FLAG: --feature-gates=""
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754176 6932 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754182 6932 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754188 6932 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754194 6932 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754200 6932 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754207 6932 flags.go:64] FLAG: --help="false"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754213 6932 flags.go:64] FLAG: --hostname-override=""
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754219 6932 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 11:52:55.755910 master-0 kubenswrapper[6932]: I0319 11:52:55.754226 6932 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754232 6932 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754238 6932 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754245 6932 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754251 6932 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754257 6932 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754263 6932 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754269 6932 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754276 6932 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754282 6932 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754288 6932 flags.go:64] FLAG: --kube-reserved=""
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754293 6932 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754299 6932 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754306 6932 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754311 6932 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754317 6932 flags.go:64] FLAG: --lock-file=""
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754323 6932 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754329 6932 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754335 6932 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754344 6932 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754350 6932 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754357 6932 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754363 6932 flags.go:64] FLAG: --logging-format="text"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754369 6932 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754376 6932 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 11:52:55.756608 master-0 kubenswrapper[6932]: I0319 11:52:55.754382 6932 flags.go:64] FLAG: --manifest-url=""
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754388 6932 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754396 6932 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754402 6932 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754410 6932 flags.go:64] FLAG: --max-pods="110"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754416 6932 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754422 6932 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754428 6932 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754434 6932 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754442 6932 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754448 6932 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754455 6932 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754473 6932 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754480 6932 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754487 6932 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754493 6932 flags.go:64] FLAG: --pod-cidr=""
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754499 6932 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754509 6932 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754515 6932 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754522 6932 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754528 6932 flags.go:64] FLAG: --port="10250"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754534 6932 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754540 6932 flags.go:64] FLAG: --provider-id=""
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754546 6932 flags.go:64] FLAG: --qos-reserved=""
Mar 19 11:52:55.757288 master-0 kubenswrapper[6932]: I0319 11:52:55.754552 6932 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754558 6932 flags.go:64] FLAG: --register-node="true"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754563 6932 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754569 6932 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754585 6932 flags.go:64] FLAG: --registry-burst="10"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754591 6932 flags.go:64] FLAG: --registry-qps="5"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754597 6932 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754603 6932 flags.go:64] FLAG: --reserved-memory=""
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754610 6932 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754617 6932 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754623 6932 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754629 6932 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754636 6932 flags.go:64] FLAG: --runonce="false"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754642 6932 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754648 6932 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754655 6932 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754661 6932 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754667 6932 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754674 6932 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754680 6932 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754685 6932 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754692 6932 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754700 6932 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754706 6932 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754712 6932 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 11:52:55.758005 master-0 kubenswrapper[6932]: I0319 11:52:55.754718 6932 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754744 6932 flags.go:64] FLAG: --system-cgroups=""
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754751 6932 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754760 6932 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754766 6932 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754772 6932 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754780 6932 flags.go:64] FLAG: --tls-min-version=""
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754786 6932 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754792 6932 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754798 6932 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754804 6932 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754810 6932 flags.go:64] FLAG: --v="2"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754818 6932 flags.go:64] FLAG: --version="false"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754827 6932 flags.go:64] FLAG: --vmodule=""
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754834 6932 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: I0319 11:52:55.754841 6932 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: W0319 11:52:55.754986 6932 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: W0319 11:52:55.754994 6932 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: W0319 11:52:55.755001 6932 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: W0319 11:52:55.755006 6932 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: W0319 11:52:55.755012 6932 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: W0319 11:52:55.755018 6932 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: W0319 11:52:55.755023 6932 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:52:55.758815 master-0 kubenswrapper[6932]: W0319 11:52:55.755029 6932 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755073 6932 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755081 6932 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755088 6932 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755095 6932 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755101 6932 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755109 6932 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755116 6932 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755122 6932 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755129 6932 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755136 6932 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755142 6932 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755147 6932 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755153 6932 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755158 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755164 6932 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755169 6932 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755174 6932 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755179 6932 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:52:55.759466 master-0 kubenswrapper[6932]: W0319 11:52:55.755184 6932 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755190 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755195 6932 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755200 6932 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755206 6932 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755211 6932 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755216 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755222 6932 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755227 6932 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755233 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755238 6932 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755244 6932 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755249 6932 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755254 6932 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755262 6932 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755268 6932 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755274 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755279 6932 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755293 6932 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755299 6932 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:52:55.760162 master-0 kubenswrapper[6932]: W0319 11:52:55.755305 6932 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755311 6932 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755317 6932 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755322 6932 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755328 6932 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755338 6932 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755343 6932 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755350 6932 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755356 6932 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755361 6932 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755366 6932 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755371 6932 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755377 6932 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755382 6932 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755387 6932 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755393 6932 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755399 6932 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755404 6932 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755409 6932 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755416 6932 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755421 6932 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:52:55.760660 master-0 kubenswrapper[6932]: W0319 11:52:55.755426 6932 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:52:55.761147 master-0 kubenswrapper[6932]: W0319 11:52:55.755431 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:52:55.761147 master-0 kubenswrapper[6932]: W0319 11:52:55.755437 6932 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:52:55.761147 master-0 kubenswrapper[6932]: W0319 11:52:55.755442 6932 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:52:55.761147 master-0 kubenswrapper[6932]: W0319 11:52:55.755447 6932 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:52:55.761147 master-0 kubenswrapper[6932]: I0319 11:52:55.755464 6932 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 11:52:55.762113 master-0 kubenswrapper[6932]: I0319 11:52:55.762080 6932 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 19 11:52:55.762113 master-0 kubenswrapper[6932]: I0319 11:52:55.762107 6932 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 11:52:55.762197 master-0 kubenswrapper[6932]: W0319 11:52:55.762167 6932 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:52:55.762197 master-0 kubenswrapper[6932]: W0319 11:52:55.762176 6932 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:52:55.762197 master-0 kubenswrapper[6932]: W0319 11:52:55.762182 6932 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:52:55.762197 master-0 kubenswrapper[6932]: W0319 11:52:55.762188 6932 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:52:55.762197 master-0 kubenswrapper[6932]: W0319 11:52:55.762192 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:52:55.762197 master-0 kubenswrapper[6932]: W0319 11:52:55.762197 6932 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:52:55.762197 master-0 kubenswrapper[6932]: W0319 11:52:55.762203 6932 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762208 6932 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762213 6932 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762218 6932 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762222 6932 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762226 6932 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762230 6932 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762234 6932 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762239 6932 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762244 6932 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762249 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762254 6932 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762260 6932 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762265 6932 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762270 6932 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762274 6932 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762278 6932 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762283 6932 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762287 6932 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762291 6932 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:52:55.762356 master-0 kubenswrapper[6932]: W0319 11:52:55.762295 6932 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762298 6932 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762302 6932 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762306 6932 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762309 6932 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762313 6932 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762316 6932 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762320 6932 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762324 6932 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762328 6932 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762333 6932 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762338 6932 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762342 6932 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762347 6932 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762353 6932 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762359 6932 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319
11:52:55.762365 6932 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762372 6932 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762378 6932 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762384 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 11:52:55.763098 master-0 kubenswrapper[6932]: W0319 11:52:55.762390 6932 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762395 6932 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762400 6932 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762404 6932 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762408 6932 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762412 6932 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762416 6932 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762422 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762426 6932 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: 
W0319 11:52:55.762432 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762438 6932 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762443 6932 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762449 6932 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762453 6932 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762458 6932 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762463 6932 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762468 6932 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762472 6932 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762477 6932 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:52:55.763624 master-0 kubenswrapper[6932]: W0319 11:52:55.762482 6932 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762486 6932 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762491 6932 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 
11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762495 6932 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762500 6932 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762504 6932 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762509 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: I0319 11:52:55.762517 6932 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762653 6932 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762661 6932 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762666 6932 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762672 6932 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762677 6932 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:52:55.765254 master-0 
kubenswrapper[6932]: W0319 11:52:55.762682 6932 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762687 6932 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:52:55.765254 master-0 kubenswrapper[6932]: W0319 11:52:55.762692 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762697 6932 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762702 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762707 6932 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762711 6932 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762716 6932 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762720 6932 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762743 6932 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762748 6932 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762752 6932 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762757 6932 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 11:52:55.765629 master-0 
kubenswrapper[6932]: W0319 11:52:55.762762 6932 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762766 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762772 6932 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762777 6932 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762781 6932 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762787 6932 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762791 6932 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762796 6932 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762801 6932 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:52:55.765629 master-0 kubenswrapper[6932]: W0319 11:52:55.762805 6932 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762810 6932 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762815 6932 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762819 6932 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762824 6932 feature_gate.go:330] 
unrecognized feature gate: ManagedBootImages Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762828 6932 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762833 6932 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762837 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762842 6932 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762846 6932 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762853 6932 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762859 6932 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762865 6932 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762870 6932 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762876 6932 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762881 6932 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762885 6932 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762891 6932 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762895 6932 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:52:55.766298 master-0 kubenswrapper[6932]: W0319 11:52:55.762902 6932 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762907 6932 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762913 6932 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762918 6932 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762923 6932 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762928 6932 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762932 6932 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762937 6932 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762942 6932 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762946 6932 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762951 6932 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762955 6932 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762960 6932 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762965 6932 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762970 6932 
feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762975 6932 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762979 6932 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762984 6932 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762988 6932 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762993 6932 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:52:55.766741 master-0 kubenswrapper[6932]: W0319 11:52:55.762999 6932 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: W0319 11:52:55.763005 6932 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: W0319 11:52:55.763010 6932 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: W0319 11:52:55.763015 6932 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: W0319 11:52:55.763022 6932 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: W0319 11:52:55.763028 6932 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: I0319 11:52:55.763035 6932 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: I0319 11:52:55.763149 6932 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: I0319 11:52:55.766101 6932 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: I0319 11:52:55.766215 6932 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
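The long runs of `unrecognized feature gate` warnings above are expected on OpenShift: the node is handed the cluster-wide feature-gate list, which includes OpenShift-only gates such as GatewayAPI or OnClusterBuild, and the kubelet warns about and then skips every name missing from its own upstream registry before logging the resolved `feature gates: {map[...]}` summary. A minimal sketch of that resolution step, purely illustrative (the real logic lives in Kubernetes' `component-base/featuregate` package; the registry and gate names below are a made-up subset):

```python
def resolve_feature_gates(known_defaults, requested):
    """Merge requested gate settings over known defaults.

    Unknown gate names produce a warning, mirroring the
    'unrecognized feature gate: X' log lines, and are skipped
    rather than treated as fatal.
    """
    resolved = dict(known_defaults)
    warnings = []
    for name, value in requested.items():
        if name not in known_defaults:
            warnings.append(f"unrecognized feature gate: {name}")
            continue
        resolved[name] = value
    return resolved, warnings


# Hypothetical registry: a few upstream gates with their defaults.
KNOWN = {"KMSv1": False, "NodeSwap": False, "ValidatingAdmissionPolicy": True}

# The cluster config carries both upstream and OpenShift-only gate names.
requested = {"KMSv1": True, "GatewayAPI": True, "NodeSwap": False}

resolved, warnings = resolve_feature_gates(KNOWN, requested)
print(warnings)   # the OpenShift-only gate is reported, not applied
print(resolved)   # known gates keep or take their requested values
```

The warnings are non-fatal by design: a config shared across components would otherwise prevent any single binary from starting.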
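The `certificate_manager.go` entries that follow show the client-certificate rotation loop: the kubelet records the certificate's expiry, picks a rotation deadline before that point, and sleeps until the deadline (here 19h46m away). A sketch of how such a jittered deadline can be computed from the certificate's validity window (the 70-90% fraction and the one-year validity are assumptions for illustration, not the exact upstream jitter):

```python
import random
from datetime import datetime, timedelta

def rotation_deadline(not_before, not_after, rng=random.random):
    """Pick a renewal time at a jittered 70-90% of the cert's validity,
    so a fleet of nodes does not all renew at the same instant.
    (Assumed fractions; illustrative only.)"""
    validity = (not_after - not_before).total_seconds()
    fraction = 0.7 + 0.2 * rng()
    return not_before + timedelta(seconds=validity * fraction)

# Hypothetical one-year client certificate expiring as in the log above.
not_before = datetime(2025, 3, 20, 11, 43, 3)
not_after = datetime(2026, 3, 20, 11, 43, 3)

# Fixed rng for a reproducible example: fraction = 0.8.
deadline = rotation_deadline(not_before, not_after, rng=lambda: 0.5)
print(deadline)              # falls 80% of the way into the window
print(not_after - deadline)  # margin held in reserve before expiry
```

Renewing well before expiry leaves time to retry against the API server if the certificate signing request is slow to be approved.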
Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: I0319 11:52:55.766549 6932 server.go:997] "Starting client certificate rotation"
Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: I0319 11:52:55.766561 6932 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 11:52:55.767203 master-0 kubenswrapper[6932]: I0319 11:52:55.766679 6932 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 11:43:03 +0000 UTC, rotation deadline is 2026-03-20 07:39:14.976299677 +0000 UTC
Mar 19 11:52:55.768110 master-0 kubenswrapper[6932]: I0319 11:52:55.766711 6932 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 19h46m19.209590427s for next certificate rotation
Mar 19 11:52:55.768110 master-0 kubenswrapper[6932]: I0319 11:52:55.767417 6932 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 11:52:55.771109 master-0 kubenswrapper[6932]: I0319 11:52:55.769960 6932 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 11:52:55.780002 master-0 kubenswrapper[6932]: I0319 11:52:55.779763 6932 log.go:25] "Validated CRI v1 runtime API"
Mar 19 11:52:55.781588 master-0 kubenswrapper[6932]: I0319 11:52:55.781505 6932 log.go:25] "Validated CRI v1 image API"
Mar 19 11:52:55.782652 master-0 kubenswrapper[6932]: I0319 11:52:55.782282 6932 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 11:52:55.787002 master-0 kubenswrapper[6932]: I0319 11:52:55.786948 6932 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 84bbc972-b2a6-48d9-8e4d-c9ff50fad0b0:/dev/vda3 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 19 11:52:55.787705 master-0 kubenswrapper[6932]: I0319 11:52:55.786995 6932 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0
minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef/userdata/shm major:0 minor:276 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a/userdata/shm major:0 minor:144 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/25dfae9bb0843173d90c844dacf16818cb3d6d61cb972bb6cd1177b47a320778/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/25dfae9bb0843173d90c844dacf16818cb3d6d61cb972bb6cd1177b47a320778/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/27419838ac6bb228f6151c74e466e550ee30c7ce1c14772f63c150dcd524d6e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/27419838ac6bb228f6151c74e466e550ee30c7ce1c14772f63c150dcd524d6e7/userdata/shm major:0 minor:279 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e/userdata/shm major:0 
minor:282 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/34e072f72c93d6369874c6beffaed27fe7a497ddbd4993eb86f92f576e79b6ab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/34e072f72c93d6369874c6beffaed27fe7a497ddbd4993eb86f92f576e79b6ab/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5359d955256489cf75babf6cd7e374f24ca5753414f295ec115bac354fbe37e1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5359d955256489cf75babf6cd7e374f24ca5753414f295ec115bac354fbe37e1/userdata/shm major:0 minor:249 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/67cce31157aba8cd32c19fdb97a814cf6764c07048a060294135b6ce20e85f0e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/67cce31157aba8cd32c19fdb97a814cf6764c07048a060294135b6ce20e85f0e/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61/userdata/shm major:0 minor:240 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/74d7d2df3602ec247c94c7641e1ca1523b5ae6b42624ca797fbd2b6225dfbfa4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/74d7d2df3602ec247c94c7641e1ca1523b5ae6b42624ca797fbd2b6225dfbfa4/userdata/shm major:0 minor:260 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/857137fd3aca8af8c5c19bcaeff329a322e9a54b7ff7f19d360c176d0e68cab5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/857137fd3aca8af8c5c19bcaeff329a322e9a54b7ff7f19d360c176d0e68cab5/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8/userdata/shm major:0 minor:280 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b/userdata/shm major:0 minor:99 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b54d0875a5c74a95cdb12684066d437927027dd749aa30fdd27e9a88de808b47/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b54d0875a5c74a95cdb12684066d437927027dd749aa30fdd27e9a88de808b47/userdata/shm major:0 minor:236 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d016f849de27f64d027bbd73120eb329b0253680086fad1c1a5d1d59daba5c27/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d016f849de27f64d027bbd73120eb329b0253680086fad1c1a5d1d59daba5c27/userdata/shm major:0 minor:111 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d9bf0e017da39714ca0d58a2ba0c46cd89a43ae7f317d13dbb6e31831feeb576/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9bf0e017da39714ca0d58a2ba0c46cd89a43ae7f317d13dbb6e31831feeb576/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ee84c91e209b8d15a57102d97b9b923b7a0a0247657f697f48616f38ce178b0b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ee84c91e209b8d15a57102d97b9b923b7a0a0247657f697f48616f38ce178b0b/userdata/shm major:0 minor:221 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/09a22c25-6073-4b1a-a029-928452ef37db/volumes/kubernetes.io~projected/kube-api-access-xx4wk:{mountpoint:/var/lib/kubelet/pods/09a22c25-6073-4b1a-a029-928452ef37db/volumes/kubernetes.io~projected/kube-api-access-xx4wk major:0 minor:105 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/kube-api-access-m7tc5:{mountpoint:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/kube-api-access-m7tc5 major:0 minor:269 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2292109e-92a9-4286-858e-dcd2ac083c43/volumes/kubernetes.io~projected/kube-api-access-8rt57:{mountpoint:/var/lib/kubelet/pods/2292109e-92a9-4286-858e-dcd2ac083c43/volumes/kubernetes.io~projected/kube-api-access-8rt57 major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~projected/kube-api-access-wls49:{mountpoint:/var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~projected/kube-api-access-wls49 major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~projected/kube-api-access-2bb2x:{mountpoint:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~projected/kube-api-access-2bb2x major:0 minor:127 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/333047c4-aeca-410e-9393-ca4e74366921/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/333047c4-aeca-410e-9393-ca4e74366921/volumes/kubernetes.io~projected/kube-api-access major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~projected/kube-api-access major:0 minor:259 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~secret/serving-cert major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~projected/kube-api-access-pngsr:{mountpoint:/var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~projected/kube-api-access-pngsr major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e2c195f-e97d-4cac-81fc-2d5c551d1c30/volumes/kubernetes.io~projected/kube-api-access-kgz7q:{mountpoint:/var/lib/kubelet/pods/4e2c195f-e97d-4cac-81fc-2d5c551d1c30/volumes/kubernetes.io~projected/kube-api-access-kgz7q major:0 minor:242 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~projected/kube-api-access-4p4hg:{mountpoint:/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~projected/kube-api-access-4p4hg major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~projected/kube-api-access-p5fnx:{mountpoint:/var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~projected/kube-api-access-p5fnx major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~secret/serving-cert major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~projected/kube-api-access-79qrb:{mountpoint:/var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~projected/kube-api-access-79qrb major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~projected/kube-api-access-m8bmw:{mountpoint:/var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~projected/kube-api-access-m8bmw major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~projected/kube-api-access-bfvz6:{mountpoint:/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~projected/kube-api-access-bfvz6 major:0 minor:216 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~projected/kube-api-access-cqr6w:{mountpoint:/var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~projected/kube-api-access-cqr6w major:0 minor:148 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~secret/webhook-cert major:0 minor:140 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~projected/kube-api-access-7spvn:{mountpoint:/var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~projected/kube-api-access-7spvn major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~projected/kube-api-access-dnl28:{mountpoint:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~projected/kube-api-access-dnl28 major:0 minor:252 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/etcd-client major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/serving-cert major:0 minor:227 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~projected/kube-api-access major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~secret/serving-cert major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~projected/kube-api-access-zrgqb:{mountpoint:/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~projected/kube-api-access-zrgqb major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~secret/serving-cert major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/kube-api-access-46m89:{mountpoint:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/kube-api-access-46m89 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~projected/kube-api-access-7nfnb:{mountpoint:/var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~projected/kube-api-access-7nfnb major:0 minor:268 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~projected/kube-api-access-qql5t:{mountpoint:/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~projected/kube-api-access-qql5t major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~secret/serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~projected/kube-api-access-nlr9q:{mountpoint:/var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~projected/kube-api-access-nlr9q major:0 minor:270 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/volumes/kubernetes.io~projected/kube-api-access-lqcvx:{mountpoint:/var/lib/kubelet/pods/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/volumes/kubernetes.io~projected/kube-api-access-lqcvx major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~projected/kube-api-access-lkm97:{mountpoint:/var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~projected/kube-api-access-lkm97 major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~projected/kube-api-access-72jlb:{mountpoint:/var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~projected/kube-api-access-72jlb major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~projected/kube-api-access major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~secret/serving-cert major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~projected/kube-api-access-s5rm4:{mountpoint:/var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~projected/kube-api-access-s5rm4 major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~projected/kube-api-access-nds54:{mountpoint:/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~projected/kube-api-access-nds54 major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~secret/serving-cert major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~projected/kube-api-access-bgs4l:{mountpoint:/var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~projected/kube-api-access-bgs4l major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~projected/kube-api-access-v27lg:{mountpoint:/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~projected/kube-api-access-v27lg major:0 minor:219 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} overlay_0-101:{mountpoint:/var/lib/containers/storage/overlay/bcf70faa811c2d8894b8c7f8cc8e5aaf25df94bbfa45659fa429e361c83870d9/merged major:0 minor:101 fsType:overlay blockSize:0} overlay_0-103:{mountpoint:/var/lib/containers/storage/overlay/6c5618531b1c799b895390ee5440a94042807ff01e7eea0c54caf8b1a20dc06e/merged major:0 minor:103 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/43ba061e78b7b4d8b629c33271751e75307760acb53a794fd036ca5701370f7e/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/5c39a2c50e7698d736a8f7aa9d82d7b1c767bb53a2e526abb9bcf7dec87710fd/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/5c645afb383fc8074a4c59ffdc9c4a81db7d9b56acb3fe4b583812122ceabd9e/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/574ab10504f42965944c2de9b826f2a3c48da25940ae1a5029f5de0fbd30c41d/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/4d7148e2f461ca7b5d085d69f3008c934c2194c8c6e27eb41226c932eda41d36/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/d744d7ba91f54f87fca24f93497c9738c3f4d45695c17724794ab01f2e913065/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/d30e06da4d0f0e414aaa9ddee59ab19514b095d25b03ae7570b4cc67b6c8a3f8/merged major:0 minor:138 fsType:overlay blockSize:0} 
overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/f7cdea35bd3a0c1d454076ac4e45750456617c7c920da7ad2dc2f8c7c3affe44/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/9d96aa4936e3fb6cd07bb60fdf9cd1bc30ff6d7c9a37aef6f94cede82c5e1d5f/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/cabcb006068adce359172b64cf71dc757be1889693be5a6acb7042d86531dd99/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/eea2ab4fa3e63f8fb62aa0b968d5e9f4e75934f9402a1210152af29daf22eaed/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/5b0a5201dc3d3c02ccb7a9a1836ec7dd089c1ac7610ff574387748c222d05792/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-162:{mountpoint:/var/lib/containers/storage/overlay/5930e0b4c225cf6eac74dc8f59d0aea646d322c625a7c5df8f00486a23aed3fa/merged major:0 minor:162 fsType:overlay blockSize:0} overlay_0-165:{mountpoint:/var/lib/containers/storage/overlay/5d44551ffb5361ef0038f2a5c834c8081fc1d28cca91f3020356d7b9ec601e22/merged major:0 minor:165 fsType:overlay blockSize:0} overlay_0-176:{mountpoint:/var/lib/containers/storage/overlay/dc7459596b41efabb85efa8a4232d0650f2220dd0b42485b9227b721eb903142/merged major:0 minor:176 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/a4939552f5e4ff970629e6e2dbdfdc01e7c6d473dfc4fedc68a9f917226a78e6/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/6205811f38af910b672d6509ddd746f40e600578e003fe0e87ba6f6b37b08a9f/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/192e8097cd63a2bcb58516710fe65a40087b0721e1713ca7c191d04452bfb462/merged major:0 minor:194 fsType:overlay blockSize:0} 
overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/e33cce358066d48b4605417812ab3130253ee6d6a268a012a37cb3108af75a9a/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/41fdb8de4747eea147fdef88bb0244292ee4fe2c57a274177913bcc925d9c778/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-247:{mountpoint:/var/lib/containers/storage/overlay/60460309109fa213629ccac84410abe1463026fc9ace85699a0982187139cf85/merged major:0 minor:247 fsType:overlay blockSize:0} overlay_0-257:{mountpoint:/var/lib/containers/storage/overlay/f05abb2f329b589e4e59ece7f196bba601bd9189e171b475bcf714ae2f01f7fb/merged major:0 minor:257 fsType:overlay blockSize:0} overlay_0-264:{mountpoint:/var/lib/containers/storage/overlay/6ac1b2db9b9e9f6953e526b0a759095a6277c2f58374b028d3549888e57e033e/merged major:0 minor:264 fsType:overlay blockSize:0} overlay_0-266:{mountpoint:/var/lib/containers/storage/overlay/d53e4460daa3867366faebedffce96d156dc47dee7cfc1e2a58575b965849830/merged major:0 minor:266 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/e41954e50b7df848de86ee8ec7a85d8d1957ce444a3ff0b5ffa3d76d4fc00b59/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/118779ca8538f4e3430011d4bc8bb91869cfe242e883e528305ddc2ddebce585/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/87f5516508b5a9fb425ff147a188f8c50995524820e416bc408b215c9391f54b/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/92524f63eb0719d9216a69cc8ff6d41a4c28199044cec7ff060a8507a8235f24/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/41290067078e96f0e73a83864d568f75bb0da2062ea687985842db643540d70b/merged major:0 minor:291 fsType:overlay blockSize:0} 
overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/131a0d91350a2fa9177e2706727a37a785ef558d75a753c85c5515977845930c/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/305a17411a200e62d5f3c47734017b210e55850da737ecd47056016c85c5c0aa/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/e1fda7696c6f76d4e5ae315ea02c6c5eaedd1538dc7e295c559f266663cb976d/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/9d508a245d78633dbb5414b178957a0e60fc4a1937a644c5d2fb394bff2ae30c/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/5b49897fc3a8c4f71d497894a79ed190feffbf63536294a0810eb9bc61384ae0/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/e9e30509c242a04163aa2fd2dd37213dbc6fc203316c4ad0b34e15b260a6966a/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/a70e0382f07837be5270314eb686264885a5f0c090da96da41f6fbea5f5bbe60/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/9ef51aee0cd3466360df38b51e0a8808e8f2aafc5d7443e6a7afef98fa7d9883/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/be48a5d2c544556abd11be88d7cf9c6540db6dc87af734596fed0d353b73f928/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/172f22de6c6c106d3f72304523252ea7447bf91e9ed7d8d24a14823906664e0d/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/ee15ca68b05a98d64dfed07c9c07537881dadb7cd2f45133fcd8717723f42a60/merged major:0 minor:62 fsType:overlay blockSize:0} 
overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/b3f15bc12a376e6c0d52b922bd0ed9ebc8922a524a545018b1de484869ad1182/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/a5b1deae682495d7d8aa00fb8088a1e07acd496f9bc7ba1b4a00744fccfa1383/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-68:{mountpoint:/var/lib/containers/storage/overlay/341171278ca209f7d32ad8217c6c629b2e89789c5fcc90329e7254862871b85c/merged major:0 minor:68 fsType:overlay blockSize:0} overlay_0-70:{mountpoint:/var/lib/containers/storage/overlay/a42d9bd1e2c3fc3201f9f4fa1599dbf49bb67c5a5553fa23cc34254b9523764a/merged major:0 minor:70 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/b06fbabf02175130808b95f0811c9a0218b406b2ff1c41d2542f77ffda5633ef/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/88fc2022854436652a1bfa9b96fd4e04776e63bcbd05915babdc36b4313f5105/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 19 11:52:55.807740 master-0 kubenswrapper[6932]: I0319 11:52:55.806548 6932 manager.go:217] Machine: {Timestamp:2026-03-19 11:52:55.805837998 +0000 UTC m=+0.164898240 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7514b5d6ada747ba9a1e5c7e73d4e6d3 SystemUUID:7514b5d6-ada7-47ba-9a1e-5c7e73d4e6d3 BootID:bab7eb38-7ae5-4f9e-8147-39f837056abe Filesystems:[{Device:overlay_0-101 DeviceMajor:0 DeviceMinor:101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~projected/kube-api-access-s5rm4 DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 
HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4e2c195f-e97d-4cac-81fc-2d5c551d1c30/volumes/kubernetes.io~projected/kube-api-access-kgz7q DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~projected/kube-api-access-bfvz6 DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/serving-cert 
DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~projected/kube-api-access-p5fnx DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61/userdata/shm DeviceMajor:0 DeviceMinor:240 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/333047c4-aeca-410e-9393-ca4e74366921/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:98 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~projected/kube-api-access-m8bmw DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e/userdata/shm DeviceMajor:0 DeviceMinor:282 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/67cce31157aba8cd32c19fdb97a814cf6764c07048a060294135b6ce20e85f0e/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ee84c91e209b8d15a57102d97b9b923b7a0a0247657f697f48616f38ce178b0b/userdata/shm DeviceMajor:0 DeviceMinor:221 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-257 DeviceMajor:0 DeviceMinor:257 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:140 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-68 DeviceMajor:0 DeviceMinor:68 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/74d7d2df3602ec247c94c7641e1ca1523b5ae6b42624ca797fbd2b6225dfbfa4/userdata/shm DeviceMajor:0 DeviceMinor:260 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~projected/kube-api-access-nlr9q DeviceMajor:0 DeviceMinor:270 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef/userdata/shm DeviceMajor:0 DeviceMinor:276 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/09a22c25-6073-4b1a-a029-928452ef37db/volumes/kubernetes.io~projected/kube-api-access-xx4wk DeviceMajor:0 DeviceMinor:105 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8/userdata/shm DeviceMajor:0 DeviceMinor:280 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/kube-api-access-m7tc5 DeviceMajor:0 DeviceMinor:269 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~projected/kube-api-access-cqr6w DeviceMajor:0 DeviceMinor:148 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~projected/kube-api-access-dnl28 DeviceMajor:0 DeviceMinor:252 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-264 DeviceMajor:0 DeviceMinor:264 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~projected/kube-api-access-72jlb DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/857137fd3aca8af8c5c19bcaeff329a322e9a54b7ff7f19d360c176d0e68cab5/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~projected/kube-api-access-nds54 DeviceMajor:0 DeviceMinor:251 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~projected/kube-api-access-qql5t DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/b54d0875a5c74a95cdb12684066d437927027dd749aa30fdd27e9a88de808b47/userdata/shm DeviceMajor:0 DeviceMinor:236 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~projected/kube-api-access-7spvn DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-162 DeviceMajor:0 DeviceMinor:162 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/kube-api-access-46m89 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-247 DeviceMajor:0 DeviceMinor:247 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~projected/kube-api-access-7nfnb DeviceMajor:0 DeviceMinor:268 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~projected/kube-api-access-79qrb DeviceMajor:0 DeviceMinor:233 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-165 DeviceMajor:0 DeviceMinor:165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~projected/kube-api-access-v27lg DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~projected/kube-api-access-zrgqb DeviceMajor:0 DeviceMinor:255 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-176 DeviceMajor:0 DeviceMinor:176 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~projected/kube-api-access-4p4hg DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9bf0e017da39714ca0d58a2ba0c46cd89a43ae7f317d13dbb6e31831feeb576/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-70 DeviceMajor:0 DeviceMinor:70 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-103 DeviceMajor:0 DeviceMinor:103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~projected/kube-api-access-lkm97 DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a/userdata/shm DeviceMajor:0 DeviceMinor:144 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~projected/kube-api-access-wls49 DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/27419838ac6bb228f6151c74e466e550ee30c7ce1c14772f63c150dcd524d6e7/userdata/shm DeviceMajor:0 DeviceMinor:279 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-266 DeviceMajor:0 DeviceMinor:266 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b/userdata/shm 
DeviceMajor:0 DeviceMinor:99 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~projected/kube-api-access-bgs4l DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~projected/kube-api-access-2bb2x DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5359d955256489cf75babf6cd7e374f24ca5753414f295ec115bac354fbe37e1/userdata/shm DeviceMajor:0 DeviceMinor:249 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:259 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/25dfae9bb0843173d90c844dacf16818cb3d6d61cb972bb6cd1177b47a320778/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/34e072f72c93d6369874c6beffaed27fe7a497ddbd4993eb86f92f576e79b6ab/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d016f849de27f64d027bbd73120eb329b0253680086fad1c1a5d1d59daba5c27/userdata/shm DeviceMajor:0 DeviceMinor:111 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2292109e-92a9-4286-858e-dcd2ac083c43/volumes/kubernetes.io~projected/kube-api-access-8rt57 DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~projected/kube-api-access-pngsr DeviceMajor:0 DeviceMinor:94 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/volumes/kubernetes.io~projected/kube-api-access-lqcvx DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] 
NetworkDevices:[{Name:159515e88e5e657 MacAddress:ea:1e:ac:bd:92:e0 Speed:10000 Mtu:8900} {Name:27419838ac6bb22 MacAddress:d2:e6:d1:53:c7:dd Speed:10000 Mtu:8900} {Name:28a32f59656edf5 MacAddress:ea:22:69:45:52:8e Speed:10000 Mtu:8900} {Name:5359d955256489c MacAddress:1a:a2:18:65:79:1c Speed:10000 Mtu:8900} {Name:67cce31157aba8c MacAddress:a2:bc:58:e8:51:35 Speed:10000 Mtu:8900} {Name:7444b7503d7740b MacAddress:46:4f:91:cb:b3:6d Speed:10000 Mtu:8900} {Name:74d7d2df3602ec2 MacAddress:c2:68:a6:39:da:fb Speed:10000 Mtu:8900} {Name:857137fd3aca8af MacAddress:3a:2a:7d:8a:54:d8 Speed:10000 Mtu:8900} {Name:8797022c969de96 MacAddress:e6:77:37:0f:25:94 Speed:10000 Mtu:8900} {Name:b54d0875a5c74a9 MacAddress:aa:0e:5c:f8:98:4c Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:a2:b3:fa:06:64:87 Speed:0 Mtu:8900} {Name:d9bf0e017da3971 MacAddress:9e:4b:91:5e:1d:32 Speed:10000 Mtu:8900} {Name:ee84c91e209b8d1 MacAddress:9e:0f:da:81:b7:77 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:3b:cf:f0 Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:0e:7e:5d:df:5c:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 
Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 19 11:52:55.807740 master-0 kubenswrapper[6932]: I0319 11:52:55.807695 6932 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 19 11:52:55.808252 master-0 kubenswrapper[6932]: I0319 11:52:55.807932 6932 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 11:52:55.808252 master-0 kubenswrapper[6932]: I0319 11:52:55.808108 6932 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 11:52:55.808252 master-0 kubenswrapper[6932]: I0319 11:52:55.808220 6932 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.808247 6932 container_manager_linux.go:272] "Creating Container Manager object based on Node Config"
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811030 6932 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811056 6932 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811068 6932 manager.go:142]
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811103 6932 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811342 6932 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811491 6932 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811572 6932 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811620 6932 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811639 6932 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811653 6932 kubelet.go:324] "Adding apiserver pod source"
Mar 19 11:52:55.812758 master-0 kubenswrapper[6932]: I0319 11:52:55.811682 6932 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 11:52:55.817916 master-0 kubenswrapper[6932]: I0319 11:52:55.817875 6932 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 11:52:55.818504 master-0 kubenswrapper[6932]: I0319 11:52:55.818467 6932 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 19 11:52:55.818681 master-0 kubenswrapper[6932]: I0319 11:52:55.818652 6932 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 19 11:52:55.818957 master-0 kubenswrapper[6932]: I0319 11:52:55.818938 6932 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 19 11:52:55.819082 master-0 kubenswrapper[6932]: I0319 11:52:55.819064 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 19 11:52:55.819128 master-0 kubenswrapper[6932]: I0319 11:52:55.819090 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 19 11:52:55.819128 master-0 kubenswrapper[6932]: I0319 11:52:55.819100 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 19 11:52:55.819128 master-0 kubenswrapper[6932]: I0319 11:52:55.819127 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 19 11:52:55.819222 master-0 kubenswrapper[6932]: I0319 11:52:55.819137 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 19 11:52:55.819222 master-0 kubenswrapper[6932]: I0319 11:52:55.819144 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 19 11:52:55.819222 master-0 kubenswrapper[6932]: I0319 11:52:55.819151 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 19 11:52:55.819222 master-0 kubenswrapper[6932]: I0319 11:52:55.819157 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 19 11:52:55.819222 master-0 kubenswrapper[6932]: I0319 11:52:55.819165 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 19 11:52:55.819222 master-0 kubenswrapper[6932]: I0319 11:52:55.819173 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 19 11:52:55.819222 master-0 kubenswrapper[6932]: I0319 11:52:55.819189 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 19 11:52:55.819222 master-0 kubenswrapper[6932]: I0319 11:52:55.819203 6932 plugins.go:603] "Loaded volume plugin"
pluginName="kubernetes.io/local-volume"
Mar 19 11:52:55.819435 master-0 kubenswrapper[6932]: I0319 11:52:55.819241 6932 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 19 11:52:55.819829 master-0 kubenswrapper[6932]: I0319 11:52:55.819654 6932 server.go:1280] "Started kubelet"
Mar 19 11:52:55.819829 master-0 kubenswrapper[6932]: I0319 11:52:55.819754 6932 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 19 11:52:55.820806 master-0 kubenswrapper[6932]: I0319 11:52:55.819867 6932 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 19 11:52:55.820806 master-0 kubenswrapper[6932]: I0319 11:52:55.820010 6932 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 19 11:52:55.821161 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 19 11:52:55.827805 master-0 kubenswrapper[6932]: I0319 11:52:55.824146 6932 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 19 11:52:55.827805 master-0 kubenswrapper[6932]: I0319 11:52:55.825091 6932 server.go:449] "Adding debug handlers to kubelet server"
Mar 19 11:52:55.827805 master-0 kubenswrapper[6932]: I0319 11:52:55.825259 6932 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 11:52:55.830789 master-0 kubenswrapper[6932]: I0319 11:52:55.830374 6932 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 11:52:55.830789 master-0 kubenswrapper[6932]: I0319 11:52:55.830423 6932 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 11:52:55.830789 master-0 kubenswrapper[6932]: I0319 11:52:55.830514 6932 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 11:52:55.830789 master-0 kubenswrapper[6932]: I0319 11:52:55.830523 6932 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 11:52:55.830789 master-0
kubenswrapper[6932]: I0319 11:52:55.830495 6932 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 11:43:03 +0000 UTC, rotation deadline is 2026-03-20 06:44:01.750175099 +0000 UTC
Mar 19 11:52:55.830789 master-0 kubenswrapper[6932]: I0319 11:52:55.830553 6932 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h51m5.919624948s for next certificate rotation
Mar 19 11:52:55.830789 master-0 kubenswrapper[6932]: I0319 11:52:55.830624 6932 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 19 11:52:55.832925 master-0 kubenswrapper[6932]: I0319 11:52:55.832892 6932 factory.go:55] Registering systemd factory
Mar 19 11:52:55.832981 master-0 kubenswrapper[6932]: I0319 11:52:55.832936 6932 factory.go:221] Registration of the systemd container factory successfully
Mar 19 11:52:55.833287 master-0 kubenswrapper[6932]: I0319 11:52:55.833224 6932 factory.go:153] Registering CRI-O factory
Mar 19 11:52:55.833287 master-0 kubenswrapper[6932]: I0319 11:52:55.833248 6932 factory.go:221] Registration of the crio container factory successfully
Mar 19 11:52:55.833360 master-0 kubenswrapper[6932]: I0319 11:52:55.833347 6932 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 11:52:55.833394 master-0 kubenswrapper[6932]: I0319 11:52:55.833373 6932 factory.go:103] Registering Raw factory
Mar 19 11:52:55.833394 master-0 kubenswrapper[6932]: I0319 11:52:55.833368 6932 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 11:52:55.833394 master-0 kubenswrapper[6932]: I0319 11:52:55.833388 6932 manager.go:1196] Started watching for new ooms in manager
Mar 19 11:52:55.834203 master-0 kubenswrapper[6932]: I0319 11:52:55.834183 6932 manager.go:319] Starting
recovery of all containers
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834684 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834755 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834766 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834775 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834784 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5078f17-bc65-460f-9f18-8c506db6840b" volumeName="kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834797 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="163d6a3d-0080-4122-bb7a-17f6e63f00f0" volumeName="kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca"
seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834810 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3ceeece-bee9-4fcb-8517-95ebce38e223" volumeName="kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834819 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1" volumeName="kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834830 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" volumeName="kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834944 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daf4dbb6-5a0a-4c92-a930-479a7330ace1" volumeName="kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.834988 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbcbba74-ac53-4724-a217-4d9b85e7c1db" volumeName="kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835000 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" volumeName="kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert" seLinuxMountContext="" Mar
19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835008 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835035 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8438d015-106b-4aed-ae12-dda781ce51fc" volumeName="kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835044 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3ceeece-bee9-4fcb-8517-95ebce38e223" volumeName="kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835053 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf08ab4f-c203-4c16-9826-8cc049f4af31" volumeName="kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835061 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daf4dbb6-5a0a-4c92-a930-479a7330ace1" volumeName="kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835096 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert" seLinuxMountContext="" Mar 19
11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835106 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835115 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" volumeName="kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835124 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c3b0d24-ce5e-49c3-a546-874356f75dc6" volumeName="kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835133 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835160 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1" volumeName="kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89" seLinuxMountContext=""
Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835170 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="163d6a3d-0080-4122-bb7a-17f6e63f00f0" volumeName="kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5" seLinuxMountContext="" Mar 19
11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835195 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" volumeName="kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert" seLinuxMountContext="" Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835218 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" volumeName="kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config" seLinuxMountContext="" Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835256 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" volumeName="kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert" seLinuxMountContext="" Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835266 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8438d015-106b-4aed-ae12-dda781ce51fc" volumeName="kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert" seLinuxMountContext="" Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835274 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f88242-8b0b-4790-bbb6-445c19b34ee7" volumeName="kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config" seLinuxMountContext="" Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835298 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" volumeName="kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config" seLinuxMountContext="" Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 
11:52:55.835307 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2292109e-92a9-4286-858e-dcd2ac083c43" volumeName="kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57" seLinuxMountContext="" Mar 19 11:52:55.835269 master-0 kubenswrapper[6932]: I0319 11:52:55.835316 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835325 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8438d015-106b-4aed-ae12-dda781ce51fc" volumeName="kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835349 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835364 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" volumeName="kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835374 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" volumeName="kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: 
I0319 11:52:55.835382 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" volumeName="kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835391 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daf4dbb6-5a0a-4c92-a930-479a7330ace1" volumeName="kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835400 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22e10648-af7c-409e-b947-570e7d807e05" volumeName="kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835409 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="333047c4-aeca-410e-9393-ca4e74366921" volumeName="kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835423 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e2c195f-e97d-4cac-81fc-2d5c551d1c30" volumeName="kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835453 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e2c195f-e97d-4cac-81fc-2d5c551d1c30" volumeName="kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 
kubenswrapper[6932]: I0319 11:52:55.835462 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9" volumeName="kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835471 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="716c2176-50f9-4c4f-af0e-4c7973457df2" volumeName="kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835479 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbcbba74-ac53-4724-a217-4d9b85e7c1db" volumeName="kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835527 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f29b11ce-60e0-46b3-8d28-eea3452513cd" volumeName="kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835565 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835608 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" volumeName="kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54" seLinuxMountContext="" Mar 19 11:52:55.839707 
master-0 kubenswrapper[6932]: I0319 11:52:55.835618 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39d3ac31-9259-454b-8e1c-e23024f8f2b2" volumeName="kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835641 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39d3ac31-9259-454b-8e1c-e23024f8f2b2" volumeName="kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835655 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8438d015-106b-4aed-ae12-dda781ce51fc" volumeName="kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835665 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835737 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3de8a1b-a5be-414f-86e8-738e16c8bc97" volumeName="kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835748 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daf4dbb6-5a0a-4c92-a930-479a7330ace1" volumeName="kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 
kubenswrapper[6932]: I0319 11:52:55.835758 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" volumeName="kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835767 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="163d6a3d-0080-4122-bb7a-17f6e63f00f0" volumeName="kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835776 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835808 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6611e325-6152-480c-9c2c-1b503e49ccd2" volumeName="kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835822 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6611e325-6152-480c-9c2c-1b503e49ccd2" volumeName="kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835835 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9" volumeName="kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config" seLinuxMountContext="" Mar 19 11:52:55.839707 
master-0 kubenswrapper[6932]: I0319 11:52:55.835844 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aaaaf539-bf61-44d7-8d47-97535b7aa1ba" volumeName="kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835852 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" volumeName="kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835861 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbcbba74-ac53-4724-a217-4d9b85e7c1db" volumeName="kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835870 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09a22c25-6073-4b1a-a029-928452ef37db" volumeName="kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835878 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835933 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" volumeName="kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn" seLinuxMountContext="" Mar 19 
11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835944 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aaaaf539-bf61-44d7-8d47-97535b7aa1ba" volumeName="kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835953 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" volumeName="kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835962 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09a22c25-6073-4b1a-a029-928452ef37db" volumeName="kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835969 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39d3ac31-9259-454b-8e1c-e23024f8f2b2" volumeName="kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.835977 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c3b0d24-ce5e-49c3-a546-874356f75dc6" volumeName="kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836000 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f88242-8b0b-4790-bbb6-445c19b34ee7" volumeName="kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 
kubenswrapper[6932]: I0319 11:52:55.836015 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836056 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" volumeName="kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836064 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09a22c25-6073-4b1a-a029-928452ef37db" volumeName="kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836072 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f88242-8b0b-4790-bbb6-445c19b34ee7" volumeName="kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836080 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836120 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1" volumeName="kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 
11:52:55.836128 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3de8a1b-a5be-414f-86e8-738e16c8bc97" volumeName="kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836137 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" volumeName="kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836145 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="333047c4-aeca-410e-9393-ca4e74366921" volumeName="kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836172 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3ceeece-bee9-4fcb-8517-95ebce38e223" volumeName="kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836220 6932 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6611e325-6152-480c-9c2c-1b503e49ccd2" volumeName="kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836229 6932 reconstruct.go:97] "Volume reconstruction finished" Mar 19 11:52:55.839707 master-0 kubenswrapper[6932]: I0319 11:52:55.836236 6932 reconciler.go:26] "Reconciler: start to sync state" Mar 19 11:52:55.867465 master-0 kubenswrapper[6932]: I0319 11:52:55.867304 6932 kubelet_network_linux.go:50] 
"Initialized iptables rules." protocol="IPv4" Mar 19 11:52:55.869488 master-0 kubenswrapper[6932]: I0319 11:52:55.869390 6932 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 19 11:52:55.869488 master-0 kubenswrapper[6932]: I0319 11:52:55.869440 6932 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 11:52:55.869591 master-0 kubenswrapper[6932]: I0319 11:52:55.869529 6932 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 11:52:55.869669 master-0 kubenswrapper[6932]: E0319 11:52:55.869583 6932 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 11:52:55.871837 master-0 kubenswrapper[6932]: I0319 11:52:55.871806 6932 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 11:52:55.881929 master-0 kubenswrapper[6932]: I0319 11:52:55.881795 6932 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="ec3103cf568fabdd9da2c1fe1b486c6e0c444ae0adfa29f7784e8224f29d03a4" exitCode=0 Mar 19 11:52:55.881929 master-0 kubenswrapper[6932]: I0319 11:52:55.881835 6932 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="a882ec3e14e198707c095bc0bdd34381c81e4c1697293837f13c4fc402ee5b87" exitCode=0 Mar 19 11:52:55.881929 master-0 kubenswrapper[6932]: I0319 11:52:55.881847 6932 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="e6b2ecdeb98ba4579257a0e7e4159cee8c04ebbb886d532c90b2d6925d5996ab" exitCode=0 Mar 19 11:52:55.881929 master-0 kubenswrapper[6932]: I0319 11:52:55.881859 6932 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="081b4d6f699ceead2b8cddd228d7b6dc1383135b83134925db54e215e05a85df" exitCode=0 Mar 19 
11:52:55.881929 master-0 kubenswrapper[6932]: I0319 11:52:55.881870 6932 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="ac2545e0b2dd4885511fea2e8cd975f1d1867cae6d7a8bfbf5aa8fba195a8d88" exitCode=0 Mar 19 11:52:55.881929 master-0 kubenswrapper[6932]: I0319 11:52:55.881878 6932 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="de0412fe0521ed4585e79b055942133d1bae28dd08d3cd77acada0e7dc47ebba" exitCode=0 Mar 19 11:52:55.887155 master-0 kubenswrapper[6932]: I0319 11:52:55.887106 6932 generic.go:334] "Generic (PLEG): container finished" podID="0121ab07-b504-4577-bb1b-fef929268726" containerID="7dae6204524503aef6defd496cb7b6d0917403d46739b0545f2e50058742fb7c" exitCode=0 Mar 19 11:52:55.890124 master-0 kubenswrapper[6932]: I0319 11:52:55.890074 6932 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="d3ab6ca62e19d2ef407ccf237743444ad88802357f607cafd2e5c6b8ac29d477" exitCode=0 Mar 19 11:52:55.907470 master-0 kubenswrapper[6932]: I0319 11:52:55.907425 6932 generic.go:334] "Generic (PLEG): container finished" podID="3053504d-0734-4def-b639-0f5cc2178185" containerID="3e8362d7d083774070cfab73695a0128d3b617dc47c3ad8cda98be3e5d078943" exitCode=0 Mar 19 11:52:55.911766 master-0 kubenswrapper[6932]: I0319 11:52:55.911716 6932 generic.go:334] "Generic (PLEG): container finished" podID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerID="d0afa60868b67a2bbb33777d6af8334fc696accf5659fb55479d8c7b865f745d" exitCode=0 Mar 19 11:52:55.927386 master-0 kubenswrapper[6932]: I0319 11:52:55.926492 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 11:52:55.927386 master-0 kubenswrapper[6932]: I0319 11:52:55.926994 6932 generic.go:334] "Generic (PLEG): container finished" 
podID="1249822f86f23526277d165c0d5d3c19" containerID="d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5" exitCode=1 Mar 19 11:52:55.927386 master-0 kubenswrapper[6932]: I0319 11:52:55.927016 6932 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="5b728a95b5ae31dab98e905315ad7bc4e11c06682ed7961c2f8d666cf463933f" exitCode=0 Mar 19 11:52:55.945196 master-0 kubenswrapper[6932]: I0319 11:52:55.944842 6932 manager.go:324] Recovery completed Mar 19 11:52:55.970160 master-0 kubenswrapper[6932]: E0319 11:52:55.970075 6932 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 11:52:55.980855 master-0 kubenswrapper[6932]: I0319 11:52:55.980822 6932 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 11:52:55.980855 master-0 kubenswrapper[6932]: I0319 11:52:55.980846 6932 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 11:52:55.980988 master-0 kubenswrapper[6932]: I0319 11:52:55.980864 6932 state_mem.go:36] "Initialized new in-memory state store" Mar 19 11:52:55.981053 master-0 kubenswrapper[6932]: I0319 11:52:55.981029 6932 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 19 11:52:55.981094 master-0 kubenswrapper[6932]: I0319 11:52:55.981046 6932 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 19 11:52:55.981094 master-0 kubenswrapper[6932]: I0319 11:52:55.981067 6932 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 19 11:52:55.981094 master-0 kubenswrapper[6932]: I0319 11:52:55.981074 6932 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 19 11:52:55.981094 master-0 kubenswrapper[6932]: I0319 11:52:55.981081 6932 policy_none.go:49] "None policy: Start" Mar 19 11:52:55.982844 master-0 kubenswrapper[6932]: I0319 11:52:55.982825 6932 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 11:52:55.982944 master-0 
kubenswrapper[6932]: I0319 11:52:55.982935 6932 state_mem.go:35] "Initializing new in-memory state store" Mar 19 11:52:55.983282 master-0 kubenswrapper[6932]: I0319 11:52:55.983272 6932 state_mem.go:75] "Updated machine memory state" Mar 19 11:52:55.983358 master-0 kubenswrapper[6932]: I0319 11:52:55.983350 6932 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 19 11:52:55.994400 master-0 kubenswrapper[6932]: I0319 11:52:55.994352 6932 manager.go:334] "Starting Device Plugin manager" Mar 19 11:52:55.994477 master-0 kubenswrapper[6932]: I0319 11:52:55.994420 6932 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 11:52:55.994477 master-0 kubenswrapper[6932]: I0319 11:52:55.994456 6932 server.go:79] "Starting device plugin registration server" Mar 19 11:52:55.997514 master-0 kubenswrapper[6932]: I0319 11:52:55.997477 6932 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 11:52:55.997567 master-0 kubenswrapper[6932]: I0319 11:52:55.997504 6932 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 11:52:55.997999 master-0 kubenswrapper[6932]: I0319 11:52:55.997642 6932 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 11:52:55.997999 master-0 kubenswrapper[6932]: I0319 11:52:55.997795 6932 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 11:52:55.997999 master-0 kubenswrapper[6932]: I0319 11:52:55.997803 6932 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 11:52:56.100872 master-0 kubenswrapper[6932]: I0319 11:52:56.097896 6932 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:56.100872 master-0 kubenswrapper[6932]: I0319 11:52:56.099781 6932 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Mar 19 11:52:56.100872 master-0 kubenswrapper[6932]: I0319 11:52:56.099822 6932 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:56.100872 master-0 kubenswrapper[6932]: I0319 11:52:56.099838 6932 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:56.100872 master-0 kubenswrapper[6932]: I0319 11:52:56.099866 6932 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:52:56.171448 master-0 kubenswrapper[6932]: I0319 11:52:56.171280 6932 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 11:52:56.172285 master-0 kubenswrapper[6932]: I0319 11:52:56.172199 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c"} Mar 19 11:52:56.172444 master-0 kubenswrapper[6932]: I0319 11:52:56.172427 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"34e072f72c93d6369874c6beffaed27fe7a497ddbd4993eb86f92f576e79b6ab"} Mar 19 11:52:56.172550 master-0 kubenswrapper[6932]: I0319 11:52:56.172534 6932 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fbc1c222e9490f91d6a952eefdb3b6b0e27ef8c0528bce673e1da480e2d8f19" Mar 19 11:52:56.172620 master-0 kubenswrapper[6932]: I0319 11:52:56.172606 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"39c756c5e9204811d8c83cfa45ff7447029413f92b87a61b82da1dc41e1a076d"} Mar 19 11:52:56.172710 master-0 kubenswrapper[6932]: I0319 11:52:56.172693 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"21c17e15f1723f8eb75ec60f42ebd73c793697e640249886764928c881dbaaa1"} Mar 19 11:52:56.172814 master-0 kubenswrapper[6932]: I0319 11:52:56.172799 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"d3ab6ca62e19d2ef407ccf237743444ad88802357f607cafd2e5c6b8ac29d477"} Mar 19 11:52:56.172873 master-0 kubenswrapper[6932]: I0319 11:52:56.172860 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856"} Mar 19 11:52:56.172969 master-0 kubenswrapper[6932]: I0319 11:52:56.172950 6932 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac789a38dd13a595ac15b76c4a526da0f4604507ea8e8d54ee1ca913a0fc96b9" Mar 19 11:52:56.173216 master-0 kubenswrapper[6932]: I0319 11:52:56.173202 6932 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19faef5336e0e62090140de4619f79a9e64f33712b5b8e70590e04d8b85ea93f" Mar 19 11:52:56.173295 master-0 kubenswrapper[6932]: I0319 11:52:56.173282 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" 
event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"901ed10fec5e9417fcd7522a27f15f9a949e9c0dd2ab8e429fd9b30afd0247bf"} Mar 19 11:52:56.173352 master-0 kubenswrapper[6932]: I0319 11:52:56.173340 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21"} Mar 19 11:52:56.173410 master-0 kubenswrapper[6932]: I0319 11:52:56.173399 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"6ecb192a1cfeb4529102ad33aeed1229502ac0d4a0688a01c8e90bffa6cdc39c"} Mar 19 11:52:56.175439 master-0 kubenswrapper[6932]: I0319 11:52:56.174780 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6"} Mar 19 11:52:56.175439 master-0 kubenswrapper[6932]: I0319 11:52:56.174800 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4"} Mar 19 11:52:56.175439 master-0 kubenswrapper[6932]: I0319 11:52:56.174816 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"0252eb9b3a6c4d52db4e7759da29168fb6757ff67b4995374ebfa16c86b93541"} Mar 19 11:52:56.175439 master-0 kubenswrapper[6932]: I0319 11:52:56.174826 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5"} Mar 19 11:52:56.175439 master-0 kubenswrapper[6932]: I0319 11:52:56.174839 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"5b728a95b5ae31dab98e905315ad7bc4e11c06682ed7961c2f8d666cf463933f"} Mar 19 11:52:56.175439 master-0 kubenswrapper[6932]: I0319 11:52:56.174848 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9"} Mar 19 11:52:56.812531 master-0 kubenswrapper[6932]: I0319 11:52:56.812032 6932 apiserver.go:52] "Watching apiserver" Mar 19 11:52:56.823204 master-0 kubenswrapper[6932]: I0319 11:52:56.823103 6932 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 11:52:56.824070 master-0 kubenswrapper[6932]: I0319 11:52:56.824026 6932 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc","openshift-network-operator/iptables-alerter-n52gc","openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w","openshift-multus/multus-552pc","openshift-multus/multus-additional-cni-plugins-n8vwk","openshift-network-node-identity/network-node-identity-j528w","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-marketplace/marketplace-operator-89ccd998f-bftt4","openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz","kube-system/bootstrap-kube-controller-manager-master-0","openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk","openshift-cluster-version/cluster-version-operator-56d8475767-pk574","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd","openshift-ovn-kubernetes/ovnkube-node-4qxkd","assisted-installer/assisted-installer-controller-48bcp","openshift-network-diagnostics/network-check-target-cr8n7","kube-system/bootstrap-kube-scheduler-master-0","openshift-network-operator/network-operator-7bd846bfc4-7fz6w","openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9","openshift-cluster-storage-operator/csi-snapshot-c
ontroller-operator-5f5d689c6b-fx8ng","openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss","openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n","openshift-multus/network-metrics-daemon-f6wv7","openshift-dns-operator/dns-operator-9c5679d8f-965np","openshift-etcd/etcd-master-0-master-0","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2","openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"] Mar 19 11:52:56.824277 master-0 kubenswrapper[6932]: I0319 11:52:56.824253 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-48bcp" Mar 19 11:52:56.825305 master-0 kubenswrapper[6932]: I0319 11:52:56.824548 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:56.825305 master-0 kubenswrapper[6932]: I0319 11:52:56.824565 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:56.825305 master-0 kubenswrapper[6932]: I0319 11:52:56.824549 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:56.828755 master-0 kubenswrapper[6932]: I0319 11:52:56.826104 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:56.828755 master-0 kubenswrapper[6932]: I0319 11:52:56.827097 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 11:52:56.829211 master-0 kubenswrapper[6932]: I0319 11:52:56.829153 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:56.829762 master-0 kubenswrapper[6932]: I0319 11:52:56.829703 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:56.830458 master-0 kubenswrapper[6932]: I0319 11:52:56.830420 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:56.831829 master-0 kubenswrapper[6932]: I0319 11:52:56.831808 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:56.832228 master-0 kubenswrapper[6932]: I0319 11:52:56.832193 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 11:52:56.833090 master-0 kubenswrapper[6932]: I0319 11:52:56.833037 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:56.841959 master-0 kubenswrapper[6932]: I0319 11:52:56.841896 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:56.841959 master-0 kubenswrapper[6932]: I0319 11:52:56.841959 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:56.842474 master-0 kubenswrapper[6932]: I0319 11:52:56.842439 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 11:52:56.842522 master-0 kubenswrapper[6932]: I0319 11:52:56.842507 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.842550 master-0 kubenswrapper[6932]: I0319 11:52:56.842530 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.842550 master-0 kubenswrapper[6932]: I0319 11:52:56.842543 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:56.842617 master-0 kubenswrapper[6932]: I0319 11:52:56.842569 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.842617 master-0 kubenswrapper[6932]: I0319 11:52:56.842591 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 11:52:56.842831 master-0 kubenswrapper[6932]: I0319 11:52:56.842801 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 11:52:56.842877 master-0 kubenswrapper[6932]: I0319 11:52:56.842859 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 11:52:56.843014 master-0 kubenswrapper[6932]: I0319 11:52:56.842990 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.843063 master-0 kubenswrapper[6932]: I0319 11:52:56.843041 
6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 11:52:56.843108 master-0 kubenswrapper[6932]: I0319 11:52:56.843090 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 19 11:52:56.843270 master-0 kubenswrapper[6932]: I0319 11:52:56.843250 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 11:52:56.843315 master-0 kubenswrapper[6932]: I0319 11:52:56.843302 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.843349 master-0 kubenswrapper[6932]: I0319 11:52:56.843323 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 19 11:52:56.843349 master-0 kubenswrapper[6932]: I0319 11:52:56.843336 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 11:52:56.843399 master-0 kubenswrapper[6932]: I0319 11:52:56.843370 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.843444 master-0 kubenswrapper[6932]: I0319 11:52:56.843425 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 11:52:56.843501 master-0 kubenswrapper[6932]: I0319 11:52:56.843482 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 11:52:56.843546 master-0 kubenswrapper[6932]: I0319 11:52:56.843519 6932 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 11:52:56.843546 master-0 kubenswrapper[6932]: I0319 11:52:56.843546 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:56.843664 master-0 kubenswrapper[6932]: I0319 11:52:56.843641 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 11:52:56.843988 master-0 kubenswrapper[6932]: I0319 11:52:56.843956 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 19 11:52:56.844132 master-0 kubenswrapper[6932]: I0319 11:52:56.844107 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 11:52:56.844309 master-0 kubenswrapper[6932]: I0319 11:52:56.844281 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.844309 master-0 kubenswrapper[6932]: I0319 11:52:56.844300 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 11:52:56.844389 master-0 kubenswrapper[6932]: I0319 11:52:56.844306 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 11:52:56.844389 master-0 kubenswrapper[6932]: I0319 11:52:56.844356 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 11:52:56.844389 master-0 kubenswrapper[6932]: I0319 11:52:56.844367 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 11:52:56.844474 master-0 
kubenswrapper[6932]: I0319 11:52:56.844400 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.845201 master-0 kubenswrapper[6932]: I0319 11:52:56.845165 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 11:52:56.845480 master-0 kubenswrapper[6932]: I0319 11:52:56.845407 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.845645 master-0 kubenswrapper[6932]: I0319 11:52:56.845553 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 11:52:56.845645 master-0 kubenswrapper[6932]: I0319 11:52:56.845644 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 11:52:56.846026 master-0 kubenswrapper[6932]: I0319 11:52:56.846002 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 11:52:56.846752 master-0 kubenswrapper[6932]: I0319 11:52:56.846707 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 11:52:56.847180 master-0 kubenswrapper[6932]: I0319 11:52:56.847155 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 11:52:56.847494 master-0 kubenswrapper[6932]: I0319 11:52:56.847270 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 11:52:56.847586 master-0 kubenswrapper[6932]: I0319 11:52:56.847563 6932 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 11:52:56.849344 master-0 kubenswrapper[6932]: I0319 11:52:56.849289 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 11:52:56.851267 master-0 kubenswrapper[6932]: I0319 11:52:56.850955 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 11:52:56.851764 master-0 kubenswrapper[6932]: I0319 11:52:56.851745 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 11:52:56.851852 master-0 kubenswrapper[6932]: I0319 11:52:56.851578 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 11:52:56.852427 master-0 kubenswrapper[6932]: I0319 11:52:56.852409 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 11:52:56.852513 master-0 kubenswrapper[6932]: I0319 11:52:56.852482 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.852588 master-0 kubenswrapper[6932]: I0319 11:52:56.852567 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 11:52:56.852588 master-0 kubenswrapper[6932]: I0319 11:52:56.852484 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 11:52:56.852858 master-0 kubenswrapper[6932]: I0319 11:52:56.852799 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 11:52:56.853978 master-0 kubenswrapper[6932]: I0319 11:52:56.853964 6932 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.854207 master-0 kubenswrapper[6932]: I0319 11:52:56.854194 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 11:52:56.859934 master-0 kubenswrapper[6932]: I0319 11:52:56.859906 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 11:52:56.860282 master-0 kubenswrapper[6932]: I0319 11:52:56.860009 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 11:52:56.860370 master-0 kubenswrapper[6932]: I0319 11:52:56.860011 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 11:52:56.860426 master-0 kubenswrapper[6932]: I0319 11:52:56.860119 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 19 11:52:56.860457 master-0 kubenswrapper[6932]: I0319 11:52:56.860133 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 11:52:56.860509 master-0 kubenswrapper[6932]: I0319 11:52:56.860195 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 11:52:56.860768 master-0 kubenswrapper[6932]: I0319 11:52:56.860741 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 11:52:56.861005 master-0 kubenswrapper[6932]: I0319 11:52:56.860990 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 11:52:56.861191 master-0 kubenswrapper[6932]: I0319 11:52:56.861166 6932 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 11:52:56.861238 master-0 kubenswrapper[6932]: I0319 11:52:56.861203 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 11:52:56.861335 master-0 kubenswrapper[6932]: I0319 11:52:56.861292 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 11:52:56.861498 master-0 kubenswrapper[6932]: I0319 11:52:56.861477 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 11:52:56.861585 master-0 kubenswrapper[6932]: I0319 11:52:56.861566 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.862167 master-0 kubenswrapper[6932]: I0319 11:52:56.861571 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 11:52:56.865089 master-0 kubenswrapper[6932]: I0319 11:52:56.865054 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 11:52:56.865647 master-0 kubenswrapper[6932]: I0319 11:52:56.865615 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 11:52:56.867573 master-0 kubenswrapper[6932]: I0319 11:52:56.867544 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 11:52:56.867718 master-0 kubenswrapper[6932]: I0319 11:52:56.867695 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 11:52:56.868291 master-0 kubenswrapper[6932]: I0319 11:52:56.868259 6932 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 11:52:56.868291 master-0 kubenswrapper[6932]: I0319 11:52:56.868284 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 11:52:56.868384 master-0 kubenswrapper[6932]: I0319 11:52:56.868340 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 11:52:56.868473 master-0 kubenswrapper[6932]: I0319 11:52:56.868439 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 11:52:56.869421 master-0 kubenswrapper[6932]: I0319 11:52:56.869388 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 11:52:56.869467 master-0 kubenswrapper[6932]: I0319 11:52:56.869425 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 11:52:56.869559 master-0 kubenswrapper[6932]: I0319 11:52:56.869511 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 11:52:56.869559 master-0 kubenswrapper[6932]: I0319 11:52:56.869550 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 11:52:56.869613 master-0 kubenswrapper[6932]: I0319 11:52:56.869589 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 11:52:56.869648 master-0 kubenswrapper[6932]: I0319 11:52:56.869619 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 11:52:56.869648 master-0 kubenswrapper[6932]: I0319 11:52:56.869540 6932 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 11:52:56.869698 master-0 kubenswrapper[6932]: I0319 11:52:56.869677 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 11:52:56.869767 master-0 kubenswrapper[6932]: I0319 11:52:56.869741 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 11:52:56.869803 master-0 kubenswrapper[6932]: I0319 11:52:56.869787 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 11:52:56.869831 master-0 kubenswrapper[6932]: I0319 11:52:56.869795 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 11:52:56.869831 master-0 kubenswrapper[6932]: I0319 11:52:56.869752 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 11:52:56.869831 master-0 kubenswrapper[6932]: I0319 11:52:56.869688 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 11:52:56.869923 master-0 kubenswrapper[6932]: I0319 11:52:56.869799 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 11:52:56.869923 master-0 kubenswrapper[6932]: I0319 11:52:56.869799 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 11:52:56.869923 master-0 kubenswrapper[6932]: I0319 11:52:56.869903 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 11:52:56.869992 master-0 kubenswrapper[6932]: I0319 11:52:56.869852 6932 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 11:52:56.869992 master-0 kubenswrapper[6932]: I0319 11:52:56.869983 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 11:52:56.870045 master-0 kubenswrapper[6932]: I0319 11:52:56.869993 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 11:52:56.870094 master-0 kubenswrapper[6932]: I0319 11:52:56.870077 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 11:52:56.870205 master-0 kubenswrapper[6932]: I0319 11:52:56.870179 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 11:52:56.870470 master-0 kubenswrapper[6932]: I0319 11:52:56.870435 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 11:52:56.870804 master-0 kubenswrapper[6932]: I0319 11:52:56.870771 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 11:52:56.871210 master-0 kubenswrapper[6932]: I0319 11:52:56.871161 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 11:52:56.871424 master-0 kubenswrapper[6932]: I0319 11:52:56.871413 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 11:52:56.871638 master-0 kubenswrapper[6932]: I0319 11:52:56.871602 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 11:52:56.872370 master-0 kubenswrapper[6932]: I0319 11:52:56.872131 6932 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 11:52:56.878983 master-0 kubenswrapper[6932]: I0319 11:52:56.878942 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 11:52:56.880454 master-0 kubenswrapper[6932]: I0319 11:52:56.880369 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 11:52:56.882399 master-0 kubenswrapper[6932]: I0319 11:52:56.882349 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 11:52:56.894789 master-0 kubenswrapper[6932]: I0319 11:52:56.894720 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 11:52:56.914714 master-0 kubenswrapper[6932]: I0319 11:52:56.914665 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 11:52:56.932409 master-0 kubenswrapper[6932]: I0319 11:52:56.932356 6932 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 19 11:52:56.934715 master-0 kubenswrapper[6932]: I0319 11:52:56.934685 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 11:52:57.755129 master-0 kubenswrapper[6932]: I0319 11:52:57.755070 6932 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 11:52:57.856165 master-0 kubenswrapper[6932]: I0319 11:52:57.856103 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 
11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856192 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856225 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856248 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856265 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856285 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856312 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkm97\" (UniqueName: \"kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856330 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856359 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856377 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 
19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856395 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856411 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856425 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856441 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856457 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod 
\"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856471 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856487 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856504 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46m89\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856525 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5fnx\" (UniqueName: \"kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 
11:52:57.856539 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856559 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856580 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856602 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856639 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856663 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:57.856679 master-0 kubenswrapper[6932]: I0319 11:52:57.856683 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856702 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856740 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856757 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" 
(UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856774 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856790 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856806 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856823 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856863 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:57.857350 master-0 kubenswrapper[6932]: I0319 11:52:57.856930 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:57.857909 master-0 kubenswrapper[6932]: I0319 11:52:57.857413 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.857909 master-0 kubenswrapper[6932]: I0319 11:52:57.857456 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:57.858064 master-0 kubenswrapper[6932]: I0319 11:52:57.858014 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 
11:52:57.859478 master-0 kubenswrapper[6932]: I0319 11:52:57.859446 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:57.859879 master-0 kubenswrapper[6932]: I0319 11:52:57.859855 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:57.860049 master-0 kubenswrapper[6932]: I0319 11:52:57.860015 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:57.860258 master-0 kubenswrapper[6932]: I0319 11:52:57.860202 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:57.860305 master-0 kubenswrapper[6932]: I0319 11:52:57.860263 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config\") pod 
\"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:57.860341 master-0 kubenswrapper[6932]: I0319 11:52:57.860307 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:57.861259 master-0 kubenswrapper[6932]: I0319 11:52:57.861221 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:57.861880 master-0 kubenswrapper[6932]: I0319 11:52:57.861853 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:57.862107 master-0 kubenswrapper[6932]: I0319 11:52:57.862085 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:52:57.862194 master-0 kubenswrapper[6932]: I0319 11:52:57.862173 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:57.862229 master-0 kubenswrapper[6932]: I0319 11:52:57.862177 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:57.862229 master-0 kubenswrapper[6932]: I0319 11:52:57.862216 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:57.862288 master-0 kubenswrapper[6932]: I0319 11:52:57.862258 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:52:57.862479 master-0 kubenswrapper[6932]: I0319 11:52:57.862458 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: 
\"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:57.862550 master-0 kubenswrapper[6932]: I0319 11:52:57.862526 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:52:57.862613 master-0 kubenswrapper[6932]: I0319 11:52:57.862592 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:57.862613 master-0 kubenswrapper[6932]: I0319 11:52:57.862607 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:57.862668 master-0 kubenswrapper[6932]: I0319 11:52:57.862639 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.862694 master-0 kubenswrapper[6932]: I0319 11:52:57.862677 6932 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:57.862778 master-0 kubenswrapper[6932]: I0319 11:52:57.862714 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:57.862858 master-0 kubenswrapper[6932]: I0319 11:52:57.862837 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:57.863009 master-0 kubenswrapper[6932]: I0319 11:52:57.862989 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:57.863367 master-0 kubenswrapper[6932]: I0319 11:52:57.863333 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" 
(UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:57.863401 master-0 kubenswrapper[6932]: I0319 11:52:57.863374 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.863435 master-0 kubenswrapper[6932]: I0319 11:52:57.863402 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.863516 master-0 kubenswrapper[6932]: I0319 11:52:57.863463 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.863591 master-0 kubenswrapper[6932]: I0319 11:52:57.863561 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.863784 master-0 kubenswrapper[6932]: I0319 11:52:57.863713 6932 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.863784 master-0 kubenswrapper[6932]: I0319 11:52:57.863761 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.863784 master-0 kubenswrapper[6932]: I0319 11:52:57.863763 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.863969 master-0 kubenswrapper[6932]: I0319 11:52:57.863835 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.863969 master-0 kubenswrapper[6932]: I0319 11:52:57.863884 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:52:57.863969 master-0 kubenswrapper[6932]: 
I0319 11:52:57.863935 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nds54\" (UniqueName: \"kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:57.863969 master-0 kubenswrapper[6932]: I0319 11:52:57.863946 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.864070 master-0 kubenswrapper[6932]: I0319 11:52:57.864009 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:57.864098 master-0 kubenswrapper[6932]: I0319 11:52:57.864071 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.864372 master-0 kubenswrapper[6932]: I0319 11:52:57.864349 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qrb\" (UniqueName: \"kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb\") pod 
\"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:57.864426 master-0 kubenswrapper[6932]: I0319 11:52:57.864383 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bb2x\" (UniqueName: \"kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.864426 master-0 kubenswrapper[6932]: I0319 11:52:57.864403 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.864483 master-0 kubenswrapper[6932]: I0319 11:52:57.864425 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:57.864483 master-0 kubenswrapper[6932]: I0319 11:52:57.864447 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:57.864483 
master-0 kubenswrapper[6932]: I0319 11:52:57.864469 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:57.864556 master-0 kubenswrapper[6932]: I0319 11:52:57.864490 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tc5\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:57.864556 master-0 kubenswrapper[6932]: I0319 11:52:57.864507 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:57.864612 master-0 kubenswrapper[6932]: I0319 11:52:57.864559 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.864612 master-0 kubenswrapper[6932]: I0319 11:52:57.864582 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.864612 master-0 kubenswrapper[6932]: I0319 11:52:57.864608 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864628 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wls49\" (UniqueName: \"kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864653 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgz7q\" (UniqueName: \"kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864674 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864699 6932 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864718 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864752 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864776 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864804 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: 
\"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864861 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864883 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.865035 master-0 kubenswrapper[6932]: I0319 11:52:57.864909 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:57.865973 master-0 kubenswrapper[6932]: I0319 11:52:57.865348 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:57.865973 master-0 kubenswrapper[6932]: I0319 11:52:57.865528 6932 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:57.865973 master-0 kubenswrapper[6932]: I0319 11:52:57.865644 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:57.865973 master-0 kubenswrapper[6932]: I0319 11:52:57.865759 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:57.865973 master-0 kubenswrapper[6932]: I0319 11:52:57.865813 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvz6\" (UniqueName: \"kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:57.865973 master-0 kubenswrapper[6932]: I0319 11:52:57.865836 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:57.865973 master-0 kubenswrapper[6932]: I0319 11:52:57.865885 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:52:57.866256 master-0 kubenswrapper[6932]: I0319 11:52:57.866028 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:57.866256 master-0 kubenswrapper[6932]: I0319 11:52:57.866124 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.866256 master-0 kubenswrapper[6932]: I0319 11:52:57.866199 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:57.866445 master-0 kubenswrapper[6932]: I0319 11:52:57.866269 6932 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4wk\" (UniqueName: \"kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.866445 master-0 kubenswrapper[6932]: I0319 11:52:57.866271 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:57.866445 master-0 kubenswrapper[6932]: I0319 11:52:57.866297 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.866445 master-0 kubenswrapper[6932]: I0319 11:52:57.866357 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:57.866583 master-0 kubenswrapper[6932]: I0319 11:52:57.866490 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.866583 master-0 
kubenswrapper[6932]: I0319 11:52:57.866491 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.866583 master-0 kubenswrapper[6932]: I0319 11:52:57.866568 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:52:57.866702 master-0 kubenswrapper[6932]: I0319 11:52:57.866613 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:57.866702 master-0 kubenswrapper[6932]: I0319 11:52:57.866631 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72jlb\" (UniqueName: \"kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:57.866702 master-0 kubenswrapper[6932]: I0319 11:52:57.866693 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:52:57.866835 master-0 kubenswrapper[6932]: I0319 11:52:57.866810 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.866876 master-0 kubenswrapper[6932]: I0319 11:52:57.866798 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.869064 master-0 kubenswrapper[6932]: I0319 11:52:57.867195 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:57.869179 master-0 kubenswrapper[6932]: I0319 11:52:57.869141 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:57.869226 master-0 kubenswrapper[6932]: I0319 11:52:57.869208 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod 
\"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.869313 master-0 kubenswrapper[6932]: I0319 11:52:57.869290 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.869361 master-0 kubenswrapper[6932]: I0319 11:52:57.869315 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:57.869436 master-0 kubenswrapper[6932]: I0319 11:52:57.869412 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.869713 master-0 kubenswrapper[6932]: I0319 11:52:57.869677 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlr9q\" (UniqueName: \"kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:57.869782 master-0 kubenswrapper[6932]: I0319 11:52:57.869715 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.869782 master-0 kubenswrapper[6932]: I0319 11:52:57.869747 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.869782 master-0 kubenswrapper[6932]: I0319 11:52:57.869768 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nfnb\" (UniqueName: \"kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:57.869782 master-0 kubenswrapper[6932]: I0319 11:52:57.869785 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:57.869899 master-0 kubenswrapper[6932]: I0319 11:52:57.869803 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 
11:52:57.869899 master-0 kubenswrapper[6932]: I0319 11:52:57.869820 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.869899 master-0 kubenswrapper[6932]: I0319 11:52:57.869836 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.869899 master-0 kubenswrapper[6932]: I0319 11:52:57.869854 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:57.869899 master-0 kubenswrapper[6932]: I0319 11:52:57.869876 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:57.869899 master-0 kubenswrapper[6932]: I0319 11:52:57.869894 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.870050 master-0 kubenswrapper[6932]: I0319 11:52:57.869920 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:52:57.870050 master-0 kubenswrapper[6932]: I0319 11:52:57.869955 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:57.870288 master-0 kubenswrapper[6932]: I0319 11:52:57.870253 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:57.870389 master-0 kubenswrapper[6932]: I0319 11:52:57.870361 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 
11:52:57.870552 master-0 kubenswrapper[6932]: I0319 11:52:57.870501 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.870552 master-0 kubenswrapper[6932]: I0319 11:52:57.870548 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:57.870652 master-0 kubenswrapper[6932]: I0319 11:52:57.870573 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.870652 master-0 kubenswrapper[6932]: I0319 11:52:57.870589 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.870652 master-0 kubenswrapper[6932]: I0319 11:52:57.870609 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token\") pod 
\"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:57.870652 master-0 kubenswrapper[6932]: I0319 11:52:57.870644 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:57.870817 master-0 kubenswrapper[6932]: I0319 11:52:57.870665 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:57.870817 master-0 kubenswrapper[6932]: I0319 11:52:57.870684 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.870817 master-0 kubenswrapper[6932]: I0319 11:52:57.870704 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgs4l\" (UniqueName: \"kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 
11:52:57.870817 master-0 kubenswrapper[6932]: I0319 11:52:57.870721 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.870817 master-0 kubenswrapper[6932]: I0319 11:52:57.870767 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.870990 master-0 kubenswrapper[6932]: I0319 11:52:57.870830 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.870990 master-0 kubenswrapper[6932]: I0319 11:52:57.870881 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:52:57.870990 master-0 kubenswrapper[6932]: I0319 11:52:57.870901 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p4hg\" (UniqueName: 
\"kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:57.870990 master-0 kubenswrapper[6932]: I0319 11:52:57.870920 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:57.870990 master-0 kubenswrapper[6932]: I0319 11:52:57.870938 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:57.870990 master-0 kubenswrapper[6932]: I0319 11:52:57.870962 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:57.870990 master-0 kubenswrapper[6932]: I0319 11:52:57.870979 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " 
pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:52:57.870990 master-0 kubenswrapper[6932]: I0319 11:52:57.870997 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.871201 master-0 kubenswrapper[6932]: I0319 11:52:57.871014 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:52:57.871201 master-0 kubenswrapper[6932]: I0319 11:52:57.871034 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qql5t\" (UniqueName: \"kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:52:57.871201 master-0 kubenswrapper[6932]: I0319 11:52:57.871057 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"
Mar 19 11:52:57.871201 master-0 kubenswrapper[6932]: I0319 11:52:57.871075 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8bmw\" (UniqueName: \"kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"
Mar 19 11:52:57.871201 master-0 kubenswrapper[6932]: I0319 11:52:57.871183 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.871322 master-0 kubenswrapper[6932]: I0319 11:52:57.871222 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrgqb\" (UniqueName: \"kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:52:57.871322 master-0 kubenswrapper[6932]: I0319 11:52:57.871241 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.871322 master-0 kubenswrapper[6932]: I0319 11:52:57.871262 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rt57\" (UniqueName: \"kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57\") pod \"csi-snapshot-controller-operator-5f5d689c6b-fx8ng\" (UID: \"2292109e-92a9-4286-858e-dcd2ac083c43\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng"
Mar 19 11:52:57.871322 master-0 kubenswrapper[6932]: I0319 11:52:57.871283 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:57.871322 master-0 kubenswrapper[6932]: I0319 11:52:57.871303 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pngsr\" (UniqueName: \"kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:52:57.871322 master-0 kubenswrapper[6932]: I0319 11:52:57.871322 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.871467 master-0 kubenswrapper[6932]: I0319 11:52:57.871343 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:52:57.871467 master-0 kubenswrapper[6932]: I0319 11:52:57.871365 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:57.871467 master-0 kubenswrapper[6932]: I0319 11:52:57.871370 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:57.871467 master-0 kubenswrapper[6932]: I0319 11:52:57.871383 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:52:57.871467 master-0 kubenswrapper[6932]: I0319 11:52:57.871405 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.871467 master-0 kubenswrapper[6932]: I0319 11:52:57.871431 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.871467 master-0 kubenswrapper[6932]: I0319 11:52:57.871458 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rm4\" (UniqueName: \"kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"
Mar 19 11:52:57.871664 master-0 kubenswrapper[6932]: I0319 11:52:57.871482 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v27lg\" (UniqueName: \"kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"
Mar 19 11:52:57.871664 master-0 kubenswrapper[6932]: I0319 11:52:57.871503 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:57.871664 master-0 kubenswrapper[6932]: I0319 11:52:57.871522 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:52:57.871664 master-0 kubenswrapper[6932]: I0319 11:52:57.871319 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"
Mar 19 11:52:57.871809 master-0 kubenswrapper[6932]: I0319 11:52:57.871694 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.871809 master-0 kubenswrapper[6932]: I0319 11:52:57.871759 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:57.871809 master-0 kubenswrapper[6932]: I0319 11:52:57.871782 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:57.871809 master-0 kubenswrapper[6932]: I0319 11:52:57.871800 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnl28\" (UniqueName: \"kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:57.871916 master-0 kubenswrapper[6932]: I0319 11:52:57.871830 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqr6w\" (UniqueName: \"kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w"
Mar 19 11:52:57.871916 master-0 kubenswrapper[6932]: I0319 11:52:57.871834 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:52:57.871916 master-0 kubenswrapper[6932]: I0319 11:52:57.871858 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqcvx\" (UniqueName: \"kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:52:57.871916 master-0 kubenswrapper[6932]: I0319 11:52:57.871883 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 11:52:57.871916 master-0 kubenswrapper[6932]: I0319 11:52:57.871908 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:52:57.872087 master-0 kubenswrapper[6932]: I0319 11:52:57.871956 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.872249 master-0 kubenswrapper[6932]: I0319 11:52:57.872190 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"
Mar 19 11:52:57.872362 master-0 kubenswrapper[6932]: I0319 11:52:57.872332 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"
Mar 19 11:52:57.872415 master-0 kubenswrapper[6932]: I0319 11:52:57.872390 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:57.872528 master-0 kubenswrapper[6932]: I0319 11:52:57.872505 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:52:57.872652 master-0 kubenswrapper[6932]: I0319 11:52:57.872623 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:52:57.872894 master-0 kubenswrapper[6932]: I0319 11:52:57.872847 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.872894 master-0 kubenswrapper[6932]: I0319 11:52:57.872872 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:52:57.873111 master-0 kubenswrapper[6932]: I0319 11:52:57.873021 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:52:57.873476 master-0 kubenswrapper[6932]: I0319 11:52:57.873424 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:52:57.874556 master-0 kubenswrapper[6932]: I0319 11:52:57.874464 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:52:57.874610 master-0 kubenswrapper[6932]: I0319 11:52:57.872109 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:52:57.972848 master-0 kubenswrapper[6932]: I0319 11:52:57.972676 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.972848 master-0 kubenswrapper[6932]: I0319 11:52:57.972813 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.972848 master-0 kubenswrapper[6932]: I0319 11:52:57.972842 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc"
Mar 19 11:52:57.972848 master-0 kubenswrapper[6932]: I0319 11:52:57.972866 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 11:52:57.973212 master-0 kubenswrapper[6932]: I0319 11:52:57.972899 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:52:57.973212 master-0 kubenswrapper[6932]: I0319 11:52:57.972981 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 11:52:57.973212 master-0 kubenswrapper[6932]: I0319 11:52:57.972998 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.973212 master-0 kubenswrapper[6932]: I0319 11:52:57.973145 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.973212 master-0 kubenswrapper[6932]: I0319 11:52:57.973195 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:52:57.973376 master-0 kubenswrapper[6932]: I0319 11:52:57.973251 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.973376 master-0 kubenswrapper[6932]: I0319 11:52:57.973280 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.973376 master-0 kubenswrapper[6932]: I0319 11:52:57.973320 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc"
Mar 19 11:52:57.973457 master-0 kubenswrapper[6932]: I0319 11:52:57.973366 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:57.973457 master-0 kubenswrapper[6932]: I0319 11:52:57.973397 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:57.973457 master-0 kubenswrapper[6932]: I0319 11:52:57.973373 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.973457 master-0 kubenswrapper[6932]: I0319 11:52:57.973425 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7"
Mar 19 11:52:57.973457 master-0 kubenswrapper[6932]: I0319 11:52:57.973453 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.973457 master-0 kubenswrapper[6932]: I0319 11:52:57.973424 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.973605 master-0 kubenswrapper[6932]: I0319 11:52:57.973499 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.973605 master-0 kubenswrapper[6932]: I0319 11:52:57.973472 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:57.973605 master-0 kubenswrapper[6932]: I0319 11:52:57.973534 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:57.973605 master-0 kubenswrapper[6932]: I0319 11:52:57.973541 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.973605 master-0 kubenswrapper[6932]: I0319 11:52:57.973559 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.973605 master-0 kubenswrapper[6932]: I0319 11:52:57.973559 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:57.973605 master-0 kubenswrapper[6932]: I0319 11:52:57.973593 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973608 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973638 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973660 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973680 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973696 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: E0319 11:52:57.973696 6932 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973709 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973757 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: E0319 11:52:57.973762 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: E0319 11:52:57.973782 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.473758371 +0000 UTC m=+2.832818593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973786 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973819 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:52:57.973827 master-0 kubenswrapper[6932]: I0319 11:52:57.973831 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: E0319 11:52:57.973885 6932 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: E0319 11:52:57.973915 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.473835743 +0000 UTC m=+2.832895965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: I0319 11:52:57.973968 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: I0319 11:52:57.973962 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: I0319 11:52:57.973997 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: E0319 11:52:57.974020 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.473995076 +0000 UTC m=+2.833055298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: I0319 11:52:57.974101 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: I0319 11:52:57.974129 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: I0319 11:52:57.974160 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:52:57.974187 master-0 kubenswrapper[6932]: I0319 11:52:57.974184 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974202 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974210 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: E0319 11:52:57.974309 6932 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: E0319 11:52:57.974351 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.474344084 +0000 UTC m=+2.833404296 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974393 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974419 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974448 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974465 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974507 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974542 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.974557 master-0 kubenswrapper[6932]: I0319 11:52:57.974550 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974558 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974585 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974578 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod 
\"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974602 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974613 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974622 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974642 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974656 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: 
\"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: E0319 11:52:57.974664 6932 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974664 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: E0319 11:52:57.974687 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.474678453 +0000 UTC m=+2.833738675 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : secret "metrics-daemon-secret" not found Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974712 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974753 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974772 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974756 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974797 6932 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974713 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974826 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974844 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974871 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.975002 master-0 
kubenswrapper[6932]: I0319 11:52:57.974888 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974876 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974925 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974938 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974948 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 
kubenswrapper[6932]: I0319 11:52:57.974959 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974973 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974976 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.974994 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.975007 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.975024 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975002 master-0 kubenswrapper[6932]: I0319 11:52:57.975043 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975067 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975080 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975107 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975159 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" 
(UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975178 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975218 6932 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975218 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975248 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.475238675 +0000 UTC m=+2.834298897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975262 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975288 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975294 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975333 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.475312727 +0000 UTC m=+2.834373049 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975355 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975359 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975364 6932 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975395 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975411 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:52:57.975871 master-0 
kubenswrapper[6932]: E0319 11:52:57.975425 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.475403249 +0000 UTC m=+2.834463471 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975447 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975455 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.47544487 +0000 UTC m=+2.834505092 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975480 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975500 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975528 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975539 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975541 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975554 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975591 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975601 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975626 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.475619984 +0000 UTC m=+2.834680206 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975640 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975663 6932 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975663 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975688 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975696 6932 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.475687926 +0000 UTC m=+2.834748148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975694 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975764 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975784 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: E0319 11:52:57.975793 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. 
No retries permitted until 2026-03-19 11:52:58.475784718 +0000 UTC m=+2.834844940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975804 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975840 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975840 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975854 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975881 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975857 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975893 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.975871 master-0 kubenswrapper[6932]: I0319 11:52:57.975915 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.975940 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") pod \"multus-552pc\" (UID: 
\"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.975958 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.975978 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.975993 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.976010 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.975998 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") pod 
\"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.976011 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.976034 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: I0319 11:52:57.976041 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: E0319 11:52:57.976061 6932 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:52:57.977538 master-0 kubenswrapper[6932]: E0319 11:52:57.976103 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:58.476088845 +0000 UTC m=+2.835149067 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found Mar 19 11:52:58.460140 master-0 kubenswrapper[6932]: E0319 11:52:58.459402 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:58.460140 master-0 kubenswrapper[6932]: W0319 11:52:58.459950 6932 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 19 11:52:58.460140 master-0 kubenswrapper[6932]: E0319 11:52:58.459979 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482475 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod 
\"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482546 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482578 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482598 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482683 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: 
I0319 11:52:58.482756 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482783 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482846 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482888 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482923 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: 
\"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482947 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.482972 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: I0319 11:52:58.483028 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483230 6932 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483283 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls 
podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.48326756 +0000 UTC m=+3.842327792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483338 6932 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483393 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.483353702 +0000 UTC m=+3.842413934 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483438 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483466 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. 
No retries permitted until 2026-03-19 11:52:59.483455105 +0000 UTC m=+3.842515337 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483510 6932 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483534 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.483526466 +0000 UTC m=+3.842586688 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483578 6932 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483599 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.483592538 +0000 UTC m=+3.842652770 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483644 6932 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483665 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.483658529 +0000 UTC m=+3.842718761 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : secret "metrics-daemon-secret" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483708 6932 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483747 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.483721631 +0000 UTC m=+3.842781863 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483794 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483819 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.483812373 +0000 UTC m=+3.842872595 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found Mar 19 11:52:58.483803 master-0 kubenswrapper[6932]: E0319 11:52:58.483864 6932 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.483888 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.483879834 +0000 UTC m=+3.842940056 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.483931 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.483952 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.483945416 +0000 UTC m=+3.843005638 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.483994 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.484021 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.484014117 +0000 UTC m=+3.843074339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.484065 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.484087 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.48407985 +0000 UTC m=+3.843140072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.484129 6932 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:58.484970 master-0 kubenswrapper[6932]: E0319 11:52:58.484149 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:59.484142381 +0000 UTC m=+3.843202603 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:59.433334 master-0 kubenswrapper[6932]: I0319 11:52:59.433287 6932 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 19 11:52:59.433868 master-0 kubenswrapper[6932]: I0319 11:52:59.433402 6932 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 19 11:52:59.444332 master-0 kubenswrapper[6932]: I0319 11:52:59.444293 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pngsr\" (UniqueName: \"kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:52:59.446796 master-0 kubenswrapper[6932]: I0319 11:52:59.446755 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:52:59.448269 master-0 kubenswrapper[6932]: I0319 11:52:59.448231 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:59.459819 master-0 
kubenswrapper[6932]: I0319 11:52:59.459764 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4wk\" (UniqueName: \"kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:52:59.460210 master-0 kubenswrapper[6932]: E0319 11:52:59.460150 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:59.460658 master-0 kubenswrapper[6932]: I0319 11:52:59.460630 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgz7q\" (UniqueName: \"kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:52:59.463758 master-0 kubenswrapper[6932]: I0319 11:52:59.461479 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46m89\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:59.463758 master-0 kubenswrapper[6932]: I0319 11:52:59.461717 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qrb\" (UniqueName: \"kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:59.463758 master-0 kubenswrapper[6932]: I0319 
11:52:59.462626 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jlb\" (UniqueName: \"kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:52:59.463758 master-0 kubenswrapper[6932]: I0319 11:52:59.462688 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8bmw\" (UniqueName: \"kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:59.463758 master-0 kubenswrapper[6932]: E0319 11:52:59.462850 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:52:59.463758 master-0 kubenswrapper[6932]: I0319 11:52:59.463357 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrgqb\" (UniqueName: \"kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:52:59.471746 master-0 kubenswrapper[6932]: I0319 11:52:59.468832 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqr6w\" (UniqueName: \"kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:52:59.471746 master-0 
kubenswrapper[6932]: I0319 11:52:59.469938 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tc5\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:59.471746 master-0 kubenswrapper[6932]: I0319 11:52:59.470674 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqcvx\" (UniqueName: \"kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:52:59.473320 master-0 kubenswrapper[6932]: I0319 11:52:59.472042 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkm97\" (UniqueName: \"kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:59.474760 master-0 kubenswrapper[6932]: I0319 11:52:59.474688 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wls49\" (UniqueName: \"kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:59.478747 master-0 kubenswrapper[6932]: I0319 11:52:59.475243 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5fnx\" (UniqueName: \"kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx\") pod 
\"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:52:59.486745 master-0 kubenswrapper[6932]: I0319 11:52:59.485347 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:52:59.486745 master-0 kubenswrapper[6932]: I0319 11:52:59.485803 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:59.486745 master-0 kubenswrapper[6932]: E0319 11:52:59.485875 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:52:59.486745 master-0 kubenswrapper[6932]: I0319 11:52:59.486251 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:52:59.486745 master-0 kubenswrapper[6932]: I0319 11:52:59.486599 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:59.486745 master-0 kubenswrapper[6932]: I0319 11:52:59.486744 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bb2x\" (UniqueName: \"kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:52:59.486923 master-0 kubenswrapper[6932]: I0319 11:52:59.486775 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgs4l\" (UniqueName: \"kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:59.487752 master-0 kubenswrapper[6932]: I0319 11:52:59.486960 6932 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 11:52:59.487752 master-0 kubenswrapper[6932]: I0319 11:52:59.487340 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nds54\" (UniqueName: \"kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:52:59.488776 master-0 kubenswrapper[6932]: I0319 11:52:59.487826 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvz6\" (UniqueName: \"kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:52:59.488776 master-0 kubenswrapper[6932]: I0319 11:52:59.487934 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v27lg\" (UniqueName: \"kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:52:59.488979 master-0 kubenswrapper[6932]: I0319 11:52:59.488866 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:59.490891 master-0 
kubenswrapper[6932]: I0319 11:52:59.490857 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qql5t\" (UniqueName: \"kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:52:59.490959 master-0 kubenswrapper[6932]: I0319 11:52:59.490899 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnl28\" (UniqueName: \"kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:52:59.491037 master-0 kubenswrapper[6932]: I0319 11:52:59.491016 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:59.491660 master-0 kubenswrapper[6932]: I0319 11:52:59.491629 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nfnb\" (UniqueName: \"kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:59.492143 master-0 kubenswrapper[6932]: I0319 11:52:59.492102 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rm4\" (UniqueName: 
\"kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:59.492483 master-0 kubenswrapper[6932]: I0319 11:52:59.492441 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p4hg\" (UniqueName: \"kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:52:59.492568 master-0 kubenswrapper[6932]: I0319 11:52:59.492537 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rt57\" (UniqueName: \"kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57\") pod \"csi-snapshot-controller-operator-5f5d689c6b-fx8ng\" (UID: \"2292109e-92a9-4286-858e-dcd2ac083c43\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng" Mar 19 11:52:59.493949 master-0 kubenswrapper[6932]: I0319 11:52:59.493921 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlr9q\" (UniqueName: \"kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:59.497959 master-0 kubenswrapper[6932]: I0319 11:52:59.497900 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: 
\"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:52:59.498044 master-0 kubenswrapper[6932]: I0319 11:52:59.497968 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:52:59.498044 master-0 kubenswrapper[6932]: I0319 11:52:59.498006 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:52:59.498107 master-0 kubenswrapper[6932]: I0319 11:52:59.498041 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:59.498211 master-0 kubenswrapper[6932]: E0319 11:52:59.498172 6932 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:52:59.498299 master-0 kubenswrapper[6932]: I0319 11:52:59.498264 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod 
\"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:52:59.498335 master-0 kubenswrapper[6932]: E0319 11:52:59.498312 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:59.498335 master-0 kubenswrapper[6932]: I0319 11:52:59.498325 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:52:59.498390 master-0 kubenswrapper[6932]: I0319 11:52:59.498375 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:52:59.498417 master-0 kubenswrapper[6932]: E0319 11:52:59.498387 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:52:59.498445 master-0 kubenswrapper[6932]: I0319 11:52:59.498410 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:52:59.498474 master-0 kubenswrapper[6932]: E0319 11:52:59.498454 6932 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:59.498474 master-0 kubenswrapper[6932]: I0319 11:52:59.498449 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:52:59.498526 master-0 kubenswrapper[6932]: I0319 11:52:59.498513 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:52:59.498554 master-0 kubenswrapper[6932]: E0319 11:52:59.498524 6932 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:59.498580 master-0 kubenswrapper[6932]: I0319 11:52:59.498560 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:52:59.498624 master-0 kubenswrapper[6932]: E0319 11:52:59.498604 6932 secret.go:189] Couldn't get secret 
openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:52:59.498675 master-0 kubenswrapper[6932]: I0319 11:52:59.498608 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:52:59.498675 master-0 kubenswrapper[6932]: E0319 11:52:59.498667 6932 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:59.498750 master-0 kubenswrapper[6932]: I0319 11:52:59.498668 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:52:59.498789 master-0 kubenswrapper[6932]: E0319 11:52:59.498746 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:52:59.498849 master-0 kubenswrapper[6932]: E0319 11:52:59.498828 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:52:59.498881 master-0 kubenswrapper[6932]: E0319 11:52:59.498841 6932 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:52:59.498908 master-0 kubenswrapper[6932]: E0319 11:52:59.498891 6932 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 
11:52:59.498952 master-0 kubenswrapper[6932]: E0319 11:52:59.498933 6932 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:52:59.499003 master-0 kubenswrapper[6932]: E0319 11:52:59.498984 6932 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:52:59.499054 master-0 kubenswrapper[6932]: E0319 11:52:59.499036 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499015462 +0000 UTC m=+5.858075724 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found Mar 19 11:52:59.499096 master-0 kubenswrapper[6932]: E0319 11:52:59.499070 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499052033 +0000 UTC m=+5.858112265 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found Mar 19 11:52:59.499127 master-0 kubenswrapper[6932]: E0319 11:52:59.499093 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499082604 +0000 UTC m=+5.858142846 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found Mar 19 11:52:59.499127 master-0 kubenswrapper[6932]: E0319 11:52:59.499114 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499103844 +0000 UTC m=+5.858164156 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:52:59.499185 master-0 kubenswrapper[6932]: E0319 11:52:59.499132 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499122605 +0000 UTC m=+5.858182837 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found Mar 19 11:52:59.499185 master-0 kubenswrapper[6932]: E0319 11:52:59.499150 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499141086 +0000 UTC m=+5.858201398 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:59.499185 master-0 kubenswrapper[6932]: E0319 11:52:59.499167 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. 
No retries permitted until 2026-03-19 11:53:01.499158216 +0000 UTC m=+5.858218458 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:52:59.499280 master-0 kubenswrapper[6932]: E0319 11:52:59.499196 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499186337 +0000 UTC m=+5.858246569 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found Mar 19 11:52:59.499280 master-0 kubenswrapper[6932]: E0319 11:52:59.499213 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499204668 +0000 UTC m=+5.858264910 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found Mar 19 11:52:59.499280 master-0 kubenswrapper[6932]: E0319 11:52:59.499231 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499222538 +0000 UTC m=+5.858282780 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found Mar 19 11:52:59.499280 master-0 kubenswrapper[6932]: E0319 11:52:59.499249 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499240268 +0000 UTC m=+5.858300510 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found Mar 19 11:52:59.499280 master-0 kubenswrapper[6932]: E0319 11:52:59.499267 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.499258449 +0000 UTC m=+5.858318681 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found Mar 19 11:52:59.499411 master-0 kubenswrapper[6932]: E0319 11:52:59.499312 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.49930213 +0000 UTC m=+5.858362372 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : secret "metrics-daemon-secret" not found Mar 19 11:52:59.544690 master-0 kubenswrapper[6932]: I0319 11:52:59.544649 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:52:59.654155 master-0 kubenswrapper[6932]: I0319 11:52:59.653798 6932 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:52:59.655298 master-0 kubenswrapper[6932]: E0319 11:52:59.655192 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-storage-version-migrator-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252,Command:[cluster-kube-storage-version-migrator-operator start],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qql5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w_openshift-kube-storage-version-migrator-operator(b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 11:52:59.655298 master-0 kubenswrapper[6932]: E0319 11:52:59.655231 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-scheduler-operator-container,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302,Command:[cluster-kube-scheduler-operator 
operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.14,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:ni
l,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-kube-scheduler-operator-dddff6458-4wj9n_openshift-kube-scheduler-operator(9b61ea14-a7ea-49f3-9df4-5655765ddf7c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 11:52:59.656610 master-0 kubenswrapper[6932]: E0319 11:52:59.656564 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" podUID="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" Mar 19 11:52:59.656610 master-0 kubenswrapper[6932]: E0319 11:52:59.656592 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" podUID="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" Mar 19 11:52:59.890141 master-0 kubenswrapper[6932]: I0319 11:52:59.889981 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:53:00.204868 master-0 kubenswrapper[6932]: I0319 11:53:00.204224 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:53:00.212260 master-0 kubenswrapper[6932]: I0319 11:53:00.212201 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:53:00.572472 master-0 kubenswrapper[6932]: I0319 11:53:00.572342 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:53:00.664753 master-0 kubenswrapper[6932]: E0319 11:53:00.664409 6932 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85" Mar 19 11:53:00.664753 master-0 kubenswrapper[6932]: E0319 11:53:00.664614 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:copy-catalogd-manifests,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85,Command:[/bin/sh],Args:[-c cp -a /openshift/manifests /operand-assets/catalogd],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:operand-assets,ReadOnly:false,MountPath:/operand-assets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4p4hg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000340000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-olm-operator-67dcd4998-rgbzk_openshift-cluster-olm-operator(6611e325-6152-480c-9c2c-1b503e49ccd2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Mar 19 11:53:00.667066 master-0 kubenswrapper[6932]: E0319 11:53:00.665882 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"copy-catalogd-manifests\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" podUID="6611e325-6152-480c-9c2c-1b503e49ccd2" Mar 19 11:53:01.447062 master-0 kubenswrapper[6932]: E0319 11:53:01.446922 6932 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e" Mar 19 11:53:01.447324 master-0 kubenswrapper[6932]: E0319 11:53:01.447149 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-apiserver-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e,Command:[cluster-openshift-apiserver-operator 
operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:KUBE_APISERVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p5fnx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminat
ionMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-apiserver-operator-d65958b8-6hsqn_openshift-apiserver-operator(66f88242-8b0b-4790-bbb6-445c19b34ee7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:53:01.448472 master-0 kubenswrapper[6932]: E0319 11:53:01.448413 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" podUID="66f88242-8b0b-4790-bbb6-445c19b34ee7" Mar 19 11:53:01.523146 master-0 kubenswrapper[6932]: I0319 11:53:01.523010 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:53:01.523146 master-0 kubenswrapper[6932]: I0319 11:53:01.523066 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:53:01.523396 master-0 kubenswrapper[6932]: E0319 11:53:01.523239 6932 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:53:01.523396 master-0 kubenswrapper[6932]: I0319 11:53:01.523296 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:53:01.523396 master-0 kubenswrapper[6932]: E0319 11:53:01.523356 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.52333204 +0000 UTC m=+9.882392352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : secret "metrics-daemon-secret" not found Mar 19 11:53:01.523546 master-0 kubenswrapper[6932]: I0319 11:53:01.523443 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:53:01.523546 master-0 kubenswrapper[6932]: E0319 11:53:01.523455 6932 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:53:01.523546 master-0 kubenswrapper[6932]: I0319 11:53:01.523495 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:53:01.523546 master-0 kubenswrapper[6932]: E0319 
11:53:01.523506 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.523490184 +0000 UTC m=+9.882550406 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found Mar 19 11:53:01.523546 master-0 kubenswrapper[6932]: I0319 11:53:01.523531 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:01.523546 master-0 kubenswrapper[6932]: E0319 11:53:01.523542 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:53:01.523709 master-0 kubenswrapper[6932]: E0319 11:53:01.523564 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.523557785 +0000 UTC m=+9.882618007 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found Mar 19 11:53:01.523709 master-0 kubenswrapper[6932]: I0319 11:53:01.523561 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:53:01.523709 master-0 kubenswrapper[6932]: I0319 11:53:01.523591 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:53:01.523709 master-0 kubenswrapper[6932]: I0319 11:53:01.523642 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:53:01.523709 master-0 kubenswrapper[6932]: I0319 11:53:01.523696 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: 
\"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:53:01.524179 master-0 kubenswrapper[6932]: E0319 11:53:01.523597 6932 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:53:01.524179 master-0 kubenswrapper[6932]: E0319 11:53:01.524038 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.524016447 +0000 UTC m=+9.883076779 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found Mar 19 11:53:01.524179 master-0 kubenswrapper[6932]: I0319 11:53:01.524063 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:01.524179 master-0 kubenswrapper[6932]: I0319 11:53:01.524112 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" 
Mar 19 11:53:01.524179 master-0 kubenswrapper[6932]: I0319 11:53:01.524179 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524192 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524248 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.524233242 +0000 UTC m=+9.883293544 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524287 6932 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524318 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.524309323 +0000 UTC m=+9.883369545 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524356 6932 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524396 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.524382845 +0000 UTC m=+9.883443067 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524522 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524546 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.524539128 +0000 UTC m=+9.883599350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524585 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524627 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.52461999 +0000 UTC m=+9.883680212 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524462 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524693 6932 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524658 6932 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524747 6932 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.524685062 +0000 UTC m=+9.883745384 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524771 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.524760973 +0000 UTC m=+9.883821326 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.524789 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.524778744 +0000 UTC m=+9.883839116 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.525184 6932 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:53:01.525537 master-0 kubenswrapper[6932]: E0319 11:53:01.525226 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:05.525216715 +0000 UTC m=+9.884276937 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found Mar 19 11:53:01.953652 master-0 kubenswrapper[6932]: I0319 11:53:01.952986 6932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:53:02.574575 master-0 kubenswrapper[6932]: E0319 11:53:02.574494 6932 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2924083224/1\": happened during read: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71" Mar 19 11:53:02.575742 master-0 kubenswrapper[6932]: E0319 11:53:02.574768 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openshift-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71,Command:[cluster-openshift-controller-manager-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:ROUTE_CONTROLLER_MANAGER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nds54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-controller-manager-operator-8c94f4649-6ghdm_openshift-controller-manager-operator(e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2924083224/1\": happened during read: context canceled" logger="UnhandledError" Mar 19 11:53:02.576057 master-0 kubenswrapper[6932]: E0319 11:53:02.575994 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2924083224/1\\\": happened during read: context canceled\"" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" podUID="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" Mar 19 11:53:02.584452 master-0 kubenswrapper[6932]: E0319 11:53:02.584402 6932 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263" Mar 19 11:53:02.584655 master-0 kubenswrapper[6932]: E0319 11:53:02.584534 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v27lg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-b865698dc-md7m5_openshift-service-ca-operator(f5d73fef-1414-4b29-97ea-42e1c0b1ef18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:53:02.585799 master-0 kubenswrapper[6932]: E0319 11:53:02.585752 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" podUID="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" Mar 19 11:53:02.991107 master-0 kubenswrapper[6932]: I0319 11:53:02.990995 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:53:02.992066 master-0 kubenswrapper[6932]: I0319 11:53:02.991292 6932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:53:02.997783 master-0 kubenswrapper[6932]: I0319 11:53:02.997710 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:53:03.240743 master-0 kubenswrapper[6932]: I0319 11:53:03.240677 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:53:03.275049 master-0 kubenswrapper[6932]: I0319 11:53:03.274923 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:53:03.516592 master-0 kubenswrapper[6932]: E0319 11:53:03.516528 6932 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55" Mar 19 11:53:03.516808 master-0 kubenswrapper[6932]: E0319 11:53:03.516743 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 
0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kgz7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-n52gc_openshift-network-operator(4e2c195f-e97d-4cac-81fc-2d5c551d1c30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:53:03.518072 master-0 kubenswrapper[6932]: E0319 11:53:03.517986 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-network-operator/iptables-alerter-n52gc" podUID="4e2c195f-e97d-4cac-81fc-2d5c551d1c30" Mar 19 11:53:03.961910 master-0 kubenswrapper[6932]: I0319 11:53:03.961842 6932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:53:03.961910 master-0 kubenswrapper[6932]: I0319 11:53:03.961903 6932 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:53:04.111981 master-0 kubenswrapper[6932]: E0319 11:53:04.111924 6932 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a" Mar 19 11:53:04.112455 master-0 kubenswrapper[6932]: E0319 11:53:04.112132 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key --terminate-on-files=/var/run/secrets/etcd-client/tls.crt --terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt 
--terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnl28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
etcd-operator-8544cbcf9c-9w7hc_openshift-etcd-operator(8fe4839d-cef4-4ec9-b146-2ae9b76d8a76): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:53:04.114609 master-0 kubenswrapper[6932]: E0319 11:53:04.113968 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" podUID="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" Mar 19 11:53:04.644120 master-0 kubenswrapper[6932]: E0319 11:53:04.643873 6932 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458" Mar 19 11:53:04.644248 master-0 kubenswrapper[6932]: E0319 11:53:04.644194 6932 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458,Command:[cluster-kube-controller-manager-operator 
operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458,ValueFrom:nil,},EnvVar{Name:CLUSTER_POLICY_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:67c988e079558dc6b20232ebf9a7f7276fee60c756caed584c9715e0bec77a5a,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.14,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-controller-manager-operator-ff989d6cc-5gvgh_openshift-kube-controller-manager-operator(dbcbba74-ac53-4724-a217-4d9b85e7c1db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:53:04.645858 master-0 kubenswrapper[6932]: E0319 11:53:04.645824 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" podUID="dbcbba74-ac53-4724-a217-4d9b85e7c1db" Mar 19 11:53:04.858600 master-0 kubenswrapper[6932]: I0319 11:53:04.855615 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-target-cr8n7"] Mar 19 11:53:04.865341 master-0 kubenswrapper[6932]: W0319 11:53:04.864813 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6230ed8f_4608_4168_8f5a_656f411b0ef7.slice/crio-4b35ea3ef7523bac2219f1d11ac9a4ce57129adbac9b8a1915c2a12e2d7a7c68 WatchSource:0}: Error finding container 4b35ea3ef7523bac2219f1d11ac9a4ce57129adbac9b8a1915c2a12e2d7a7c68: Status 404 returned error can't find the container with id 4b35ea3ef7523bac2219f1d11ac9a4ce57129adbac9b8a1915c2a12e2d7a7c68 Mar 19 11:53:04.971626 master-0 kubenswrapper[6932]: I0319 11:53:04.971551 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cr8n7" event={"ID":"6230ed8f-4608-4168-8f5a-656f411b0ef7","Type":"ContainerStarted","Data":"4b35ea3ef7523bac2219f1d11ac9a4ce57129adbac9b8a1915c2a12e2d7a7c68"} Mar 19 11:53:04.974106 master-0 kubenswrapper[6932]: I0319 11:53:04.974053 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng" event={"ID":"2292109e-92a9-4286-858e-dcd2ac083c43","Type":"ContainerStarted","Data":"c087dc1143a8421ab8a00ad29e5840756fe0ea34423617cb84e7aaf43479e0b3"} Mar 19 11:53:04.976161 master-0 kubenswrapper[6932]: I0319 11:53:04.976114 6932 generic.go:334] "Generic (PLEG): container finished" podID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerID="dd209082a1a57426061cd8939f69f004966e7309cc74fc36f14397708b5c4388" exitCode=0 Mar 19 11:53:04.976219 master-0 kubenswrapper[6932]: I0319 11:53:04.976194 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerDied","Data":"dd209082a1a57426061cd8939f69f004966e7309cc74fc36f14397708b5c4388"} Mar 19 11:53:04.978617 master-0 
kubenswrapper[6932]: I0319 11:53:04.978575 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" event={"ID":"732989c5-1b89-46f0-9917-b68613f7f005","Type":"ContainerStarted","Data":"4ee16bcaa03f25cf971556786ccb51f285719b794843e45ad52bd8134e676a54"} Mar 19 11:53:05.580326 master-0 kubenswrapper[6932]: I0319 11:53:05.580273 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:53:05.580326 master-0 kubenswrapper[6932]: I0319 11:53:05.580334 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580355 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580374 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580398 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580423 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580449 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580484 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580517 6932 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580544 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580568 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580592 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: I0319 11:53:05.580613 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.580746 6932 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.580794 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.580779762 +0000 UTC m=+17.939839984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581131 6932 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581191 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581183741 +0000 UTC m=+17.940243963 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581227 6932 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581245 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581239372 +0000 UTC m=+17.940299594 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581276 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581292 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581285873 +0000 UTC m=+17.940346095 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581322 6932 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581340 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581334225 +0000 UTC m=+17.940394437 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581372 6932 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581392 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581385517 +0000 UTC m=+17.940445739 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581425 6932 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581444 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581437118 +0000 UTC m=+17.940497330 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : secret "metrics-daemon-secret" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581476 6932 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581493 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581487889 +0000 UTC m=+17.940548111 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581534 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581553 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.58154728 +0000 UTC m=+17.940607502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581584 6932 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581599 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581594001 +0000 UTC m=+17.940654223 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581629 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581645 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581639992 +0000 UTC m=+17.940700214 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581672 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581689 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581682933 +0000 UTC m=+17.940743155 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581719 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 11:53:05.581712 master-0 kubenswrapper[6932]: E0319 11:53:05.581752 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:53:13.581746725 +0000 UTC m=+17.940806937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found
Mar 19 11:53:05.982376 master-0 kubenswrapper[6932]: I0319 11:53:05.982322 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cr8n7" event={"ID":"6230ed8f-4608-4168-8f5a-656f411b0ef7","Type":"ContainerStarted","Data":"f0de33426b3de859f748102fe5b738a26318a4d5bc54c76dffa506bcd12710ee"}
Mar 19 11:53:05.982638 master-0 kubenswrapper[6932]: I0319 11:53:05.982457 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-cr8n7"
Mar 19 11:53:06.325600 master-0 kubenswrapper[6932]: I0319 11:53:06.325301 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4"]
Mar 19 11:53:06.325827 master-0 kubenswrapper[6932]: E0319 11:53:06.325771 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0121ab07-b504-4577-bb1b-fef929268726" containerName="prober"
Mar 19 11:53:06.325827 master-0 kubenswrapper[6932]: I0319 11:53:06.325808 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="0121ab07-b504-4577-bb1b-fef929268726" containerName="prober"
Mar 19 11:53:06.325827 master-0 kubenswrapper[6932]: E0319 11:53:06.325819 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller"
Mar 19 11:53:06.325827 master-0 kubenswrapper[6932]: I0319 11:53:06.325827 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller"
Mar 19 11:53:06.325962 master-0 kubenswrapper[6932]: I0319 11:53:06.325909 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="0121ab07-b504-4577-bb1b-fef929268726" containerName="prober"
Mar 19 11:53:06.325962 master-0 kubenswrapper[6932]: I0319 11:53:06.325923 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller"
Mar 19 11:53:06.326211 master-0 kubenswrapper[6932]: I0319 11:53:06.326181 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4"
Mar 19 11:53:06.380861 master-0 kubenswrapper[6932]: I0319 11:53:06.377892 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4"]
Mar 19 11:53:06.395441 master-0 kubenswrapper[6932]: I0319 11:53:06.395383 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzdh\" (UniqueName: \"kubernetes.io/projected/d625c81e-01cc-424a-997d-546a5204a72b-kube-api-access-tgzdh\") pod \"csi-snapshot-controller-64854d9cff-764k4\" (UID: \"d625c81e-01cc-424a-997d-546a5204a72b\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4"
Mar 19 11:53:06.497126 master-0 kubenswrapper[6932]: I0319 11:53:06.496974 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzdh\" (UniqueName: \"kubernetes.io/projected/d625c81e-01cc-424a-997d-546a5204a72b-kube-api-access-tgzdh\") pod \"csi-snapshot-controller-64854d9cff-764k4\" (UID: \"d625c81e-01cc-424a-997d-546a5204a72b\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4"
Mar 19 11:53:06.518776 master-0 kubenswrapper[6932]: I0319 11:53:06.515705 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzdh\" (UniqueName: \"kubernetes.io/projected/d625c81e-01cc-424a-997d-546a5204a72b-kube-api-access-tgzdh\") pod \"csi-snapshot-controller-64854d9cff-764k4\" (UID: \"d625c81e-01cc-424a-997d-546a5204a72b\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4"
Mar 19 11:53:06.639602 master-0 kubenswrapper[6932]: I0319 11:53:06.639460 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4"
Mar 19 11:53:06.801740 master-0 kubenswrapper[6932]: I0319 11:53:06.801673 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4"]
Mar 19 11:53:06.944161 master-0 kubenswrapper[6932]: I0319 11:53:06.944091 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:53:06.944402 master-0 kubenswrapper[6932]: I0319 11:53:06.944247 6932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 11:53:06.949454 master-0 kubenswrapper[6932]: I0319 11:53:06.949423 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:53:07.999772 master-0 kubenswrapper[6932]: I0319 11:53:07.999341 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerStarted","Data":"402476363b5df4845bdf76440169d41c48c7c304f89463a3160ab10c4b9c45da"}
Mar 19 11:53:08.001504 master-0 kubenswrapper[6932]: I0319 11:53:08.001051 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerStarted","Data":"c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6"}
Mar 19 11:53:08.001504 master-0 kubenswrapper[6932]: I0319 11:53:08.001349 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:53:08.151510 master-0 kubenswrapper[6932]: I0319 11:53:08.151455 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:53:08.160194 master-0 kubenswrapper[6932]: I0319 11:53:08.160159 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:53:08.160544 master-0 kubenswrapper[6932]: I0319 11:53:08.160527 6932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 11:53:08.160629 master-0 kubenswrapper[6932]: I0319 11:53:08.160619 6932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 11:53:08.185887 master-0 kubenswrapper[6932]: I0319 11:53:08.185833 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:53:08.851882 master-0 kubenswrapper[6932]: I0319 11:53:08.851806 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:53:08.857578 master-0 kubenswrapper[6932]: I0319 11:53:08.857530 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:53:09.005132 master-0 kubenswrapper[6932]: I0319 11:53:09.005065 6932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 11:53:09.010258 master-0 kubenswrapper[6932]: I0319 11:53:09.010225 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:53:09.887237 master-0 kubenswrapper[6932]: I0319 11:53:09.887039 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:53:09.912531 master-0 kubenswrapper[6932]: I0319 11:53:09.912455 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:53:10.011165 master-0 kubenswrapper[6932]: I0319 11:53:10.011084 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerStarted","Data":"e21a965ed4cb2dd18edb22058723998ac546681c370497fc8735a2d87bc17971"}
Mar 19 11:53:10.030841 master-0 kubenswrapper[6932]: I0319 11:53:10.030627 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" podStartSLOduration=1.6213762969999999 podStartE2EDuration="4.030591555s" podCreationTimestamp="2026-03-19 11:53:06 +0000 UTC" firstStartedPulling="2026-03-19 11:53:07.182068569 +0000 UTC m=+11.541128791" lastFinishedPulling="2026-03-19 11:53:09.591283817 +0000 UTC m=+13.950344049" observedRunningTime="2026-03-19 11:53:10.030164895 +0000 UTC m=+14.389225127" watchObservedRunningTime="2026-03-19 11:53:10.030591555 +0000 UTC m=+14.389651777"
Mar 19 11:53:11.550772 master-0 kubenswrapper[6932]: I0319 11:53:11.550682 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:53:13.030142 master-0 kubenswrapper[6932]: I0319 11:53:13.029539 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" event={"ID":"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d","Type":"ContainerStarted","Data":"ffd01994498e412e963b01ac06f0e6ad28082a18471897dde077305cc7888366"}
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: I0319 11:53:13.581288 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574"
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: I0319 11:53:13.581394 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: I0319 11:53:13.581430 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: I0319 11:53:13.581449 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.581511 6932 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.581593 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert podName:333047c4-aeca-410e-9393-ca4e74366921 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.581572533 +0000 UTC m=+33.940632755 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert") pod "cluster-version-operator-56d8475767-pk574" (UID: "333047c4-aeca-410e-9393-ca4e74366921") : secret "cluster-version-operator-serving-cert" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: I0319 11:53:13.581639 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.581685 6932 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.581805 6932 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.581859 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls podName:a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.581824249 +0000 UTC m=+33.940884661 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-nrtp2" (UID: "a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1") : secret "image-registry-operator-tls" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.581889 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.58187815 +0000 UTC m=+33.940938382 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: I0319 11:53:13.581943 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.581948 6932 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.581969 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.582055 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.582031154 +0000 UTC m=+33.941091366 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.582103 6932 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.582125 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.582105705 +0000 UTC m=+33.941165927 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "node-tuning-operator-tls" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.582152 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls podName:163d6a3d-0080-4122-bb7a-17f6e63f00f0 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.582135126 +0000 UTC m=+33.941195578 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls") pod "ingress-operator-66b84d69b-qrjj4" (UID: "163d6a3d-0080-4122-bb7a-17f6e63f00f0") : secret "metrics-tls" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: I0319 11:53:13.582181 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.582275 6932 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 19 11:53:13.581497 master-0 kubenswrapper[6932]: E0319 11:53:13.582318 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.58230848 +0000 UTC m=+33.941368922 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : secret "metrics-daemon-secret" not found
Mar 19 11:53:13.683427 master-0 kubenswrapper[6932]: I0319 11:53:13.683384 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np"
Mar 19 11:53:13.683427 master-0 kubenswrapper[6932]: I0319 11:53:13.683433 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"
Mar 19 11:53:13.683746 master-0 kubenswrapper[6932]: E0319 11:53:13.683593 6932 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:53:13.683746 master-0 kubenswrapper[6932]: E0319 11:53:13.683698 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls podName:22e10648-af7c-409e-b947-570e7d807e05 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.683667618 +0000 UTC m=+34.042727850 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls") pod "dns-operator-9c5679d8f-965np" (UID: "22e10648-af7c-409e-b947-570e7d807e05") : secret "metrics-tls" not found
Mar 19 11:53:13.683922 master-0 kubenswrapper[6932]: E0319 11:53:13.683886 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 11:53:13.683922 master-0 kubenswrapper[6932]: I0319 11:53:13.683898 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 11:53:13.684008 master-0 kubenswrapper[6932]: I0319 11:53:13.683941 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"
Mar 19 11:53:13.684008 master-0 kubenswrapper[6932]: E0319 11:53:13.683959 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.683939024 +0000 UTC m=+34.042999286 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found
Mar 19 11:53:13.684008 master-0 kubenswrapper[6932]: I0319 11:53:13.683984 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:53:13.684008 master-0 kubenswrapper[6932]: E0319 11:53:13.684007 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 11:53:13.684201 master-0 kubenswrapper[6932]: I0319 11:53:13.684016 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"
Mar 19 11:53:13.684201 master-0 kubenswrapper[6932]: E0319 11:53:13.684023 6932 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 11:53:13.684201 master-0 kubenswrapper[6932]: E0319 11:53:13.684037 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.684028717 +0000 UTC m=+34.043088939 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found
Mar 19 11:53:13.684317 master-0 kubenswrapper[6932]: E0319 11:53:13.684249 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.684229441 +0000 UTC m=+34.043289683 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found
Mar 19 11:53:13.684317 master-0 kubenswrapper[6932]: E0319 11:53:13.684276 6932 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:53:13.684392 master-0 kubenswrapper[6932]: E0319 11:53:13.684349 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 11:53:13.684392 master-0 kubenswrapper[6932]: E0319 11:53:13.684391 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.684378815 +0000 UTC m=+34.043439157 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found
Mar 19 11:53:13.684535 master-0 kubenswrapper[6932]: E0319 11:53:13.684413 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert podName:aaaaf539-bf61-44d7-8d47-97535b7aa1ba nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.684402785 +0000 UTC m=+34.043463137 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-kb5vd" (UID: "aaaaf539-bf61-44d7-8d47-97535b7aa1ba") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:53:13.791532 master-0 kubenswrapper[6932]: I0319 11:53:13.791466 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-jls48"]
Mar 19 11:53:13.792124 master-0 kubenswrapper[6932]: I0319 11:53:13.792096 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48"
Mar 19 11:53:13.796602 master-0 kubenswrapper[6932]: I0319 11:53:13.796567 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 11:53:13.798050 master-0 kubenswrapper[6932]: I0319 11:53:13.798015 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 11:53:13.803565 master-0 kubenswrapper[6932]: I0319 11:53:13.803519 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-jls48"]
Mar 19 11:53:13.886502 master-0 kubenswrapper[6932]: I0319 11:53:13.886410 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbjk\" (UniqueName: \"kubernetes.io/projected/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5-kube-api-access-jdbjk\") pod \"migrator-8487694857-jls48\" (UID: \"f3b6a8b5-bcaa-47f6-a9d5-6186981191d5\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48"
Mar 19 11:53:13.988100 master-0 kubenswrapper[6932]: I0319 11:53:13.987946 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbjk\" (UniqueName: \"kubernetes.io/projected/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5-kube-api-access-jdbjk\") pod \"migrator-8487694857-jls48\" (UID: \"f3b6a8b5-bcaa-47f6-a9d5-6186981191d5\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48"
Mar 19 11:53:14.007555 master-0 kubenswrapper[6932]: I0319 11:53:14.007492 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbjk\" (UniqueName: \"kubernetes.io/projected/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5-kube-api-access-jdbjk\") pod \"migrator-8487694857-jls48\" (UID: \"f3b6a8b5-bcaa-47f6-a9d5-6186981191d5\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48"
Mar 19 11:53:14.037431 master-0 kubenswrapper[6932]: I0319 11:53:14.037363 6932 generic.go:334] "Generic (PLEG): container finished" podID="6611e325-6152-480c-9c2c-1b503e49ccd2" containerID="4eed89e87867c4e687c139b0ec5fb8c1e755d1dd5bc8ea8ed4c3c3f5eeb362b4" exitCode=0
Mar 19 11:53:14.037431 master-0 kubenswrapper[6932]: I0319 11:53:14.037412 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" event={"ID":"6611e325-6152-480c-9c2c-1b503e49ccd2","Type":"ContainerDied","Data":"4eed89e87867c4e687c139b0ec5fb8c1e755d1dd5bc8ea8ed4c3c3f5eeb362b4"}
Mar 19 11:53:14.105716 master-0 kubenswrapper[6932]: I0319 11:53:14.105633 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48"
Mar 19 11:53:14.304800 master-0 kubenswrapper[6932]: I0319 11:53:14.304440 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-jls48"]
Mar 19 11:53:15.043482 master-0 kubenswrapper[6932]: I0319 11:53:15.043336 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48" event={"ID":"f3b6a8b5-bcaa-47f6-a9d5-6186981191d5","Type":"ContainerStarted","Data":"26bcc676a684bba59ce239a7b0c6d837715bffea1d6d9d661570c6d71c3af31c"}
Mar 19 11:53:16.049669 master-0 kubenswrapper[6932]: I0319 11:53:16.049608 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" event={"ID":"9b61ea14-a7ea-49f3-9df4-5655765ddf7c","Type":"ContainerStarted","Data":"a63fe33504bcc71f9b4e0c9d251065dc432b3176905c1514b755fad213c3ed25"}
Mar 19 11:53:16.052060 master-0 kubenswrapper[6932]: I0319 11:53:16.051976 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" event={"ID":"f5d73fef-1414-4b29-97ea-42e1c0b1ef18","Type":"ContainerStarted","Data":"a00e4976297d868e9d1a74ee69351e1ac6225f1b3fff400804a95076bf8deddd"}
Mar 19 11:53:17.056398 master-0 kubenswrapper[6932]: I0319 11:53:17.056346 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" event={"ID":"66f88242-8b0b-4790-bbb6-445c19b34ee7","Type":"ContainerStarted","Data":"f48ebfe02dc1f93683f1d2eea873f5d0c2c3081e3483e2d09faebd411fa396ef"}
Mar 19 11:53:17.059064 master-0 kubenswrapper[6932]: I0319 11:53:17.058556 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48" event={"ID":"f3b6a8b5-bcaa-47f6-a9d5-6186981191d5","Type":"ContainerStarted","Data":"24691df575b5898e612b8d72caa173c79bcb99501d68989bf1a8ed1dcc119015"}
Mar 19 11:53:17.060124 master-0 kubenswrapper[6932]: I0319 11:53:17.060042 6932 generic.go:334] "Generic (PLEG): container finished" podID="6611e325-6152-480c-9c2c-1b503e49ccd2" containerID="f631f12266f4b047459015dd86cbaf1ce99efc325ca568b49d919857a5c8c1d9" exitCode=0
Mar 19 11:53:17.060124 master-0 kubenswrapper[6932]: I0319 11:53:17.060096 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" event={"ID":"6611e325-6152-480c-9c2c-1b503e49ccd2","Type":"ContainerDied","Data":"f631f12266f4b047459015dd86cbaf1ce99efc325ca568b49d919857a5c8c1d9"}
Mar 19 11:53:18.066235 master-0 kubenswrapper[6932]: I0319 11:53:18.065894 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48" event={"ID":"f3b6a8b5-bcaa-47f6-a9d5-6186981191d5","Type":"ContainerStarted","Data":"985d0942af53c6ade272c27e5b3cf61b415457195e0e86943c020cfd7b69e7ef"}
Mar 19 11:53:18.068329 master-0 kubenswrapper[6932]: I0319 11:53:18.068290 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" event={"ID":"dbcbba74-ac53-4724-a217-4d9b85e7c1db","Type":"ContainerStarted","Data":"b6e56f4e0942ab58cf693081930c0b921d6a49180ecc1e1f47356ba56a945538"}
Mar 19 11:53:18.105144 master-0 kubenswrapper[6932]: I0319 11:53:18.105061 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48" podStartSLOduration=2.747340918 podStartE2EDuration="5.105038768s" podCreationTimestamp="2026-03-19 11:53:13 +0000 UTC" firstStartedPulling="2026-03-19 11:53:14.317109261 +0000 UTC m=+18.676169483" lastFinishedPulling="2026-03-19 11:53:16.674807111 +0000 UTC m=+21.033867333" observedRunningTime="2026-03-19 11:53:18.084080567 +0000 UTC m=+22.443140789" watchObservedRunningTime="2026-03-19 11:53:18.105038768 +0000 UTC m=+22.464099000"
Mar 19 11:53:18.931566 master-0 kubenswrapper[6932]: I0319 11:53:18.931512 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-lzfbh"]
Mar 19 11:53:18.932507 master-0 kubenswrapper[6932]: I0319 11:53:18.932486 6932 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:18.937845 master-0 kubenswrapper[6932]: I0319 11:53:18.936821 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 11:53:18.937845 master-0 kubenswrapper[6932]: I0319 11:53:18.937235 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 11:53:18.937845 master-0 kubenswrapper[6932]: I0319 11:53:18.937367 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 11:53:18.938184 master-0 kubenswrapper[6932]: I0319 11:53:18.937873 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 11:53:18.954983 master-0 kubenswrapper[6932]: I0319 11:53:18.954030 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-lzfbh"] Mar 19 11:53:19.063628 master-0 kubenswrapper[6932]: I0319 11:53:19.063569 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5th4l\" (UniqueName: \"kubernetes.io/projected/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-kube-api-access-5th4l\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.063883 master-0 kubenswrapper[6932]: I0319 11:53:19.063695 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-cabundle\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.063883 master-0 kubenswrapper[6932]: I0319 11:53:19.063719 6932 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-key\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.165637 master-0 kubenswrapper[6932]: I0319 11:53:19.165507 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-cabundle\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.166172 master-0 kubenswrapper[6932]: I0319 11:53:19.166055 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-key\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.166172 master-0 kubenswrapper[6932]: I0319 11:53:19.166140 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5th4l\" (UniqueName: \"kubernetes.io/projected/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-kube-api-access-5th4l\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.167772 master-0 kubenswrapper[6932]: I0319 11:53:19.166862 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-cabundle\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.176385 master-0 
kubenswrapper[6932]: I0319 11:53:19.176322 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-key\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.184750 master-0 kubenswrapper[6932]: I0319 11:53:19.184688 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5th4l\" (UniqueName: \"kubernetes.io/projected/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-kube-api-access-5th4l\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:19.298885 master-0 kubenswrapper[6932]: I0319 11:53:19.298800 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:53:20.483048 master-0 kubenswrapper[6932]: I0319 11:53:20.482666 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-lzfbh"] Mar 19 11:53:21.085781 master-0 kubenswrapper[6932]: I0319 11:53:21.085334 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" event={"ID":"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf","Type":"ContainerStarted","Data":"3ab6a68db657d0e7924cc47a81bc9831d8055a58f93210e34c6ef5c5b5597505"} Mar 19 11:53:21.086763 master-0 kubenswrapper[6932]: I0319 11:53:21.086707 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" event={"ID":"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76","Type":"ContainerStarted","Data":"b1921d5234eb4af4d7731c20be87a9595434841b33d272f8f2c3ade584fe4c62"} Mar 19 11:53:21.088477 master-0 kubenswrapper[6932]: I0319 11:53:21.088447 6932 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" event={"ID":"6611e325-6152-480c-9c2c-1b503e49ccd2","Type":"ContainerStarted","Data":"5b56b51126590bf802dd88d10f125adb62528aa19311215ff5bc2461894ca90f"} Mar 19 11:53:21.089800 master-0 kubenswrapper[6932]: I0319 11:53:21.089758 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" event={"ID":"6e76fc3f-39a4-4f99-8603-38a94da6ea8e","Type":"ContainerStarted","Data":"16c227dc0b0576e898d8700943b55acbed9d8d9f77d6c9678d33a3bca8e15f98"} Mar 19 11:53:21.089870 master-0 kubenswrapper[6932]: I0319 11:53:21.089805 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" event={"ID":"6e76fc3f-39a4-4f99-8603-38a94da6ea8e","Type":"ContainerStarted","Data":"a1c364bd3d663a56cc2f90bf6e8ea8c50127add36b90978697972f8218a89ed7"} Mar 19 11:53:21.169171 master-0 kubenswrapper[6932]: I0319 11:53:21.169096 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" podStartSLOduration=3.169076699 podStartE2EDuration="3.169076699s" podCreationTimestamp="2026-03-19 11:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:21.145419228 +0000 UTC m=+25.504479450" watchObservedRunningTime="2026-03-19 11:53:21.169076699 +0000 UTC m=+25.528136921" Mar 19 11:53:22.433446 master-0 kubenswrapper[6932]: I0319 11:53:22.433389 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-f4z5p"] Mar 19 11:53:22.434377 master-0 kubenswrapper[6932]: I0319 11:53:22.433942 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.437226 master-0 kubenswrapper[6932]: I0319 11:53:22.437186 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 11:53:22.437956 master-0 kubenswrapper[6932]: I0319 11:53:22.437936 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 11:53:22.438135 master-0 kubenswrapper[6932]: I0319 11:53:22.438119 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 11:53:22.438305 master-0 kubenswrapper[6932]: I0319 11:53:22.438290 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 11:53:22.441582 master-0 kubenswrapper[6932]: I0319 11:53:22.441557 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 11:53:22.442635 master-0 kubenswrapper[6932]: I0319 11:53:22.442614 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 11:53:22.463545 master-0 kubenswrapper[6932]: I0319 11:53:22.463506 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-f4z5p"] Mar 19 11:53:22.535286 master-0 kubenswrapper[6932]: I0319 11:53:22.535244 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.535489 master-0 kubenswrapper[6932]: I0319 11:53:22.535323 6932 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.535489 master-0 kubenswrapper[6932]: I0319 11:53:22.535388 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.535489 master-0 kubenswrapper[6932]: I0319 11:53:22.535431 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7crf\" (UniqueName: \"kubernetes.io/projected/79de344d-db63-45a0-8494-37dbb087b274-kube-api-access-f7crf\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.535489 master-0 kubenswrapper[6932]: I0319 11:53:22.535460 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.636535 master-0 kubenswrapper[6932]: I0319 11:53:22.636467 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles\") pod 
\"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.636535 master-0 kubenswrapper[6932]: I0319 11:53:22.636531 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7crf\" (UniqueName: \"kubernetes.io/projected/79de344d-db63-45a0-8494-37dbb087b274-kube-api-access-f7crf\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.636839 master-0 kubenswrapper[6932]: E0319 11:53:22.636621 6932 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 19 11:53:22.636839 master-0 kubenswrapper[6932]: I0319 11:53:22.636673 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.636839 master-0 kubenswrapper[6932]: E0319 11:53:22.636798 6932 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:53:22.636839 master-0 kubenswrapper[6932]: E0319 11:53:22.636837 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert podName:79de344d-db63-45a0-8494-37dbb087b274 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:23.136823391 +0000 UTC m=+27.495883613 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert") pod "controller-manager-f5df8899c-f4z5p" (UID: "79de344d-db63-45a0-8494-37dbb087b274") : secret "serving-cert" not found Mar 19 11:53:22.637036 master-0 kubenswrapper[6932]: I0319 11:53:22.636946 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.637079 master-0 kubenswrapper[6932]: E0319 11:53:22.636966 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles podName:79de344d-db63-45a0-8494-37dbb087b274 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:23.136957503 +0000 UTC m=+27.496017725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles") pod "controller-manager-f5df8899c-f4z5p" (UID: "79de344d-db63-45a0-8494-37dbb087b274") : configmap "openshift-global-ca" not found Mar 19 11:53:22.637114 master-0 kubenswrapper[6932]: E0319 11:53:22.637053 6932 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 11:53:22.637149 master-0 kubenswrapper[6932]: E0319 11:53:22.637122 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca podName:79de344d-db63-45a0-8494-37dbb087b274 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:23.137114987 +0000 UTC m=+27.496175209 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca") pod "controller-manager-f5df8899c-f4z5p" (UID: "79de344d-db63-45a0-8494-37dbb087b274") : configmap "client-ca" not found Mar 19 11:53:22.637149 master-0 kubenswrapper[6932]: I0319 11:53:22.637118 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:22.637301 master-0 kubenswrapper[6932]: E0319 11:53:22.637187 6932 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Mar 19 11:53:22.637301 master-0 kubenswrapper[6932]: E0319 11:53:22.637236 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config podName:79de344d-db63-45a0-8494-37dbb087b274 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:23.137222949 +0000 UTC m=+27.496283171 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config") pod "controller-manager-f5df8899c-f4z5p" (UID: "79de344d-db63-45a0-8494-37dbb087b274") : configmap "config" not found Mar 19 11:53:22.671314 master-0 kubenswrapper[6932]: I0319 11:53:22.671254 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7crf\" (UniqueName: \"kubernetes.io/projected/79de344d-db63-45a0-8494-37dbb087b274-kube-api-access-f7crf\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:23.144177 master-0 kubenswrapper[6932]: I0319 11:53:23.144106 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:23.144421 master-0 kubenswrapper[6932]: I0319 11:53:23.144381 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:23.144480 master-0 kubenswrapper[6932]: I0319 11:53:23.144439 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:23.144565 
master-0 kubenswrapper[6932]: E0319 11:53:23.144540 6932 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 19 11:53:23.144637 master-0 kubenswrapper[6932]: E0319 11:53:23.144616 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles podName:79de344d-db63-45a0-8494-37dbb087b274 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:24.144596097 +0000 UTC m=+28.503656319 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles") pod "controller-manager-f5df8899c-f4z5p" (UID: "79de344d-db63-45a0-8494-37dbb087b274") : configmap "openshift-global-ca" not found Mar 19 11:53:23.145157 master-0 kubenswrapper[6932]: E0319 11:53:23.144977 6932 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:53:23.145157 master-0 kubenswrapper[6932]: I0319 11:53:23.145091 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:23.145262 master-0 kubenswrapper[6932]: E0319 11:53:23.145106 6932 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 11:53:23.145262 master-0 kubenswrapper[6932]: I0319 11:53:23.145196 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: 
\"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" Mar 19 11:53:23.145262 master-0 kubenswrapper[6932]: E0319 11:53:23.145137 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert podName:79de344d-db63-45a0-8494-37dbb087b274 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:24.145113669 +0000 UTC m=+28.504173881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert") pod "controller-manager-f5df8899c-f4z5p" (UID: "79de344d-db63-45a0-8494-37dbb087b274") : secret "serving-cert" not found Mar 19 11:53:23.145262 master-0 kubenswrapper[6932]: E0319 11:53:23.145242 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca podName:79de344d-db63-45a0-8494-37dbb087b274 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:24.145231901 +0000 UTC m=+28.504292123 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca") pod "controller-manager-f5df8899c-f4z5p" (UID: "79de344d-db63-45a0-8494-37dbb087b274") : configmap "client-ca" not found Mar 19 11:53:23.259213 master-0 kubenswrapper[6932]: I0319 11:53:23.258656 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-f4z5p"] Mar 19 11:53:23.259591 master-0 kubenswrapper[6932]: E0319 11:53:23.259544 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p" podUID="79de344d-db63-45a0-8494-37dbb087b274" Mar 19 11:53:23.263332 master-0 kubenswrapper[6932]: I0319 11:53:23.263287 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"] Mar 19 11:53:23.263903 master-0 kubenswrapper[6932]: I0319 11:53:23.263882 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v" Mar 19 11:53:23.266572 master-0 kubenswrapper[6932]: I0319 11:53:23.266493 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 11:53:23.267648 master-0 kubenswrapper[6932]: I0319 11:53:23.267601 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 11:53:23.267648 master-0 kubenswrapper[6932]: I0319 11:53:23.267623 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 11:53:23.268150 master-0 kubenswrapper[6932]: I0319 11:53:23.268123 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 11:53:23.268340 master-0 kubenswrapper[6932]: I0319 11:53:23.268304 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 11:53:23.282408 master-0 kubenswrapper[6932]: I0319 11:53:23.282357 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"] Mar 19 11:53:23.348190 master-0 kubenswrapper[6932]: I0319 11:53:23.348122 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-client-ca\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v" Mar 19 11:53:23.348190 master-0 kubenswrapper[6932]: I0319 11:53:23.348180 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v" Mar 19 11:53:23.348543 master-0 kubenswrapper[6932]: I0319 11:53:23.348490 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-config\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v" Mar 19 11:53:23.348636 master-0 kubenswrapper[6932]: I0319 11:53:23.348598 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k2lt\" (UniqueName: \"kubernetes.io/projected/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-kube-api-access-4k2lt\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v" Mar 19 11:53:23.449520 master-0 kubenswrapper[6932]: I0319 11:53:23.449470 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-client-ca\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v" Mar 19 11:53:23.450131 master-0 kubenswrapper[6932]: I0319 11:53:23.449528 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: 
\"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:23.450131 master-0 kubenswrapper[6932]: I0319 11:53:23.449805 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-config\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:23.450131 master-0 kubenswrapper[6932]: E0319 11:53:23.449930 6932 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:53:23.450131 master-0 kubenswrapper[6932]: E0319 11:53:23.449991 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert podName:1a7cb8c3-2773-4d00-95ef-9e292323d3f7 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:23.949973523 +0000 UTC m=+28.309033735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert") pod "route-controller-manager-564c64dc4c-zm89v" (UID: "1a7cb8c3-2773-4d00-95ef-9e292323d3f7") : secret "serving-cert" not found
Mar 19 11:53:23.450363 master-0 kubenswrapper[6932]: I0319 11:53:23.450273 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k2lt\" (UniqueName: \"kubernetes.io/projected/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-kube-api-access-4k2lt\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:23.450703 master-0 kubenswrapper[6932]: I0319 11:53:23.450647 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-client-ca\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:23.451012 master-0 kubenswrapper[6932]: I0319 11:53:23.450978 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-config\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:23.470493 master-0 kubenswrapper[6932]: I0319 11:53:23.470437 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k2lt\" (UniqueName: \"kubernetes.io/projected/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-kube-api-access-4k2lt\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:23.956618 master-0 kubenswrapper[6932]: I0319 11:53:23.956549 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:23.956869 master-0 kubenswrapper[6932]: E0319 11:53:23.956766 6932 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:53:23.956869 master-0 kubenswrapper[6932]: E0319 11:53:23.956860 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert podName:1a7cb8c3-2773-4d00-95ef-9e292323d3f7 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:24.95684212 +0000 UTC m=+29.315902342 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert") pod "route-controller-manager-564c64dc4c-zm89v" (UID: "1a7cb8c3-2773-4d00-95ef-9e292323d3f7") : secret "serving-cert" not found
Mar 19 11:53:24.111063 master-0 kubenswrapper[6932]: I0319 11:53:24.110983 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p"
Mar 19 11:53:24.111516 master-0 kubenswrapper[6932]: I0319 11:53:24.110931 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n52gc" event={"ID":"4e2c195f-e97d-4cac-81fc-2d5c551d1c30","Type":"ContainerStarted","Data":"11d6e31735d62ce503d8d3416258f3b1f9eed9d14d7523dff450f6c35d25830f"}
Mar 19 11:53:24.123223 master-0 kubenswrapper[6932]: I0319 11:53:24.123174 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p"
Mar 19 11:53:24.158428 master-0 kubenswrapper[6932]: I0319 11:53:24.158372 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p"
Mar 19 11:53:24.158798 master-0 kubenswrapper[6932]: I0319 11:53:24.158489 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p"
Mar 19 11:53:24.158798 master-0 kubenswrapper[6932]: I0319 11:53:24.158524 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p"
Mar 19 11:53:24.159016 master-0 kubenswrapper[6932]: E0319 11:53:24.158991 6932 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:53:24.159197 master-0 kubenswrapper[6932]: E0319 11:53:24.159104 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert podName:79de344d-db63-45a0-8494-37dbb087b274 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:26.159086847 +0000 UTC m=+30.518147119 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert") pod "controller-manager-f5df8899c-f4z5p" (UID: "79de344d-db63-45a0-8494-37dbb087b274") : secret "serving-cert" not found
Mar 19 11:53:24.178946 master-0 kubenswrapper[6932]: I0319 11:53:24.178875 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p"
Mar 19 11:53:24.179547 master-0 kubenswrapper[6932]: I0319 11:53:24.179513 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca\") pod \"controller-manager-f5df8899c-f4z5p\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") " pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p"
Mar 19 11:53:24.261328 master-0 kubenswrapper[6932]: I0319 11:53:24.261212 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7crf\" (UniqueName: \"kubernetes.io/projected/79de344d-db63-45a0-8494-37dbb087b274-kube-api-access-f7crf\") pod \"79de344d-db63-45a0-8494-37dbb087b274\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") "
Mar 19 11:53:24.261538 master-0 kubenswrapper[6932]: I0319 11:53:24.261391 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles\") pod \"79de344d-db63-45a0-8494-37dbb087b274\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") "
Mar 19 11:53:24.261538 master-0 kubenswrapper[6932]: I0319 11:53:24.261454 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config\") pod \"79de344d-db63-45a0-8494-37dbb087b274\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") "
Mar 19 11:53:24.261538 master-0 kubenswrapper[6932]: I0319 11:53:24.261485 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca\") pod \"79de344d-db63-45a0-8494-37dbb087b274\" (UID: \"79de344d-db63-45a0-8494-37dbb087b274\") "
Mar 19 11:53:24.262581 master-0 kubenswrapper[6932]: I0319 11:53:24.262497 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config" (OuterVolumeSpecName: "config") pod "79de344d-db63-45a0-8494-37dbb087b274" (UID: "79de344d-db63-45a0-8494-37dbb087b274"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:53:24.262675 master-0 kubenswrapper[6932]: I0319 11:53:24.262561 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca" (OuterVolumeSpecName: "client-ca") pod "79de344d-db63-45a0-8494-37dbb087b274" (UID: "79de344d-db63-45a0-8494-37dbb087b274"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:53:24.262782 master-0 kubenswrapper[6932]: I0319 11:53:24.262666 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:24.263130 master-0 kubenswrapper[6932]: I0319 11:53:24.263073 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "79de344d-db63-45a0-8494-37dbb087b274" (UID: "79de344d-db63-45a0-8494-37dbb087b274"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:53:24.273015 master-0 kubenswrapper[6932]: I0319 11:53:24.272948 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79de344d-db63-45a0-8494-37dbb087b274-kube-api-access-f7crf" (OuterVolumeSpecName: "kube-api-access-f7crf") pod "79de344d-db63-45a0-8494-37dbb087b274" (UID: "79de344d-db63-45a0-8494-37dbb087b274"). InnerVolumeSpecName "kube-api-access-f7crf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:53:24.364777 master-0 kubenswrapper[6932]: I0319 11:53:24.364703 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7crf\" (UniqueName: \"kubernetes.io/projected/79de344d-db63-45a0-8494-37dbb087b274-kube-api-access-f7crf\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:24.364777 master-0 kubenswrapper[6932]: I0319 11:53:24.364774 6932 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:24.364999 master-0 kubenswrapper[6932]: I0319 11:53:24.364794 6932 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79de344d-db63-45a0-8494-37dbb087b274-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:24.971743 master-0 kubenswrapper[6932]: I0319 11:53:24.971642 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert\") pod \"route-controller-manager-564c64dc4c-zm89v\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") " pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:24.972763 master-0 kubenswrapper[6932]: E0319 11:53:24.971841 6932 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:53:24.972763 master-0 kubenswrapper[6932]: E0319 11:53:24.971943 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert podName:1a7cb8c3-2773-4d00-95ef-9e292323d3f7 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:26.971922353 +0000 UTC m=+31.330982645 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert") pod "route-controller-manager-564c64dc4c-zm89v" (UID: "1a7cb8c3-2773-4d00-95ef-9e292323d3f7") : secret "serving-cert" not found
Mar 19 11:53:25.046793 master-0 kubenswrapper[6932]: I0319 11:53:25.046751 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"]
Mar 19 11:53:25.047267 master-0 kubenswrapper[6932]: E0319 11:53:25.047241 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v" podUID="1a7cb8c3-2773-4d00-95ef-9e292323d3f7"
Mar 19 11:53:25.114151 master-0 kubenswrapper[6932]: I0319 11:53:25.114117 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:25.114402 master-0 kubenswrapper[6932]: I0319 11:53:25.114355 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5df8899c-f4z5p"
Mar 19 11:53:25.122436 master-0 kubenswrapper[6932]: I0319 11:53:25.122409 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:25.152508 master-0 kubenswrapper[6932]: I0319 11:53:25.152452 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-f4z5p"]
Mar 19 11:53:25.159150 master-0 kubenswrapper[6932]: I0319 11:53:25.159068 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-f4z5p"]
Mar 19 11:53:25.275520 master-0 kubenswrapper[6932]: I0319 11:53:25.275408 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-client-ca\") pod \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") "
Mar 19 11:53:25.275520 master-0 kubenswrapper[6932]: I0319 11:53:25.275467 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k2lt\" (UniqueName: \"kubernetes.io/projected/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-kube-api-access-4k2lt\") pod \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") "
Mar 19 11:53:25.275520 master-0 kubenswrapper[6932]: I0319 11:53:25.275504 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-config\") pod \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\" (UID: \"1a7cb8c3-2773-4d00-95ef-9e292323d3f7\") "
Mar 19 11:53:25.276071 master-0 kubenswrapper[6932]: I0319 11:53:25.276042 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-config" (OuterVolumeSpecName: "config") pod "1a7cb8c3-2773-4d00-95ef-9e292323d3f7" (UID: "1a7cb8c3-2773-4d00-95ef-9e292323d3f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:53:25.276174 master-0 kubenswrapper[6932]: I0319 11:53:25.276150 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79de344d-db63-45a0-8494-37dbb087b274-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:25.276210 master-0 kubenswrapper[6932]: I0319 11:53:25.276178 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:25.276261 master-0 kubenswrapper[6932]: I0319 11:53:25.276212 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "1a7cb8c3-2773-4d00-95ef-9e292323d3f7" (UID: "1a7cb8c3-2773-4d00-95ef-9e292323d3f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:53:25.284332 master-0 kubenswrapper[6932]: I0319 11:53:25.284288 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-kube-api-access-4k2lt" (OuterVolumeSpecName: "kube-api-access-4k2lt") pod "1a7cb8c3-2773-4d00-95ef-9e292323d3f7" (UID: "1a7cb8c3-2773-4d00-95ef-9e292323d3f7"). InnerVolumeSpecName "kube-api-access-4k2lt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:53:25.377680 master-0 kubenswrapper[6932]: I0319 11:53:25.377620 6932 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:25.377680 master-0 kubenswrapper[6932]: I0319 11:53:25.377660 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k2lt\" (UniqueName: \"kubernetes.io/projected/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-kube-api-access-4k2lt\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:25.874854 master-0 kubenswrapper[6932]: I0319 11:53:25.874815 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79de344d-db63-45a0-8494-37dbb087b274" path="/var/lib/kubelet/pods/79de344d-db63-45a0-8494-37dbb087b274/volumes"
Mar 19 11:53:26.116492 master-0 kubenswrapper[6932]: I0319 11:53:26.116388 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"
Mar 19 11:53:26.138302 master-0 kubenswrapper[6932]: I0319 11:53:26.138244 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"]
Mar 19 11:53:26.141245 master-0 kubenswrapper[6932]: I0319 11:53:26.141206 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-564c64dc4c-zm89v"]
Mar 19 11:53:26.300052 master-0 kubenswrapper[6932]: I0319 11:53:26.299957 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a7cb8c3-2773-4d00-95ef-9e292323d3f7-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:27.482079 master-0 kubenswrapper[6932]: I0319 11:53:27.481648 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d4875b77f-58xfr"]
Mar 19 11:53:27.482933 master-0 kubenswrapper[6932]: I0319 11:53:27.482517 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"]
Mar 19 11:53:27.482981 master-0 kubenswrapper[6932]: I0319 11:53:27.482939 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.483364 master-0 kubenswrapper[6932]: I0319 11:53:27.483337 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.492253 master-0 kubenswrapper[6932]: I0319 11:53:27.492212 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 11:53:27.492440 master-0 kubenswrapper[6932]: I0319 11:53:27.492284 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 11:53:27.492440 master-0 kubenswrapper[6932]: I0319 11:53:27.492214 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 11:53:27.492440 master-0 kubenswrapper[6932]: I0319 11:53:27.492360 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 11:53:27.492567 master-0 kubenswrapper[6932]: I0319 11:53:27.492430 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 11:53:27.492567 master-0 kubenswrapper[6932]: I0319 11:53:27.492431 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 11:53:27.492567 master-0 kubenswrapper[6932]: I0319 11:53:27.492551 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 11:53:27.492682 master-0 kubenswrapper[6932]: I0319 11:53:27.492643 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 11:53:27.492844 master-0 kubenswrapper[6932]: I0319 11:53:27.492816 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 11:53:27.493550 master-0 kubenswrapper[6932]: I0319 11:53:27.493512 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 11:53:27.496814 master-0 kubenswrapper[6932]: I0319 11:53:27.496755 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 11:53:27.528256 master-0 kubenswrapper[6932]: I0319 11:53:27.525007 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d4875b77f-58xfr"]
Mar 19 11:53:27.528256 master-0 kubenswrapper[6932]: I0319 11:53:27.525061 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"]
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626147 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2bq5\" (UniqueName: \"kubernetes.io/projected/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-kube-api-access-z2bq5\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626229 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-client-ca\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626256 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-proxy-ca-bundles\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626343 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j7ll\" (UniqueName: \"kubernetes.io/projected/12ec81c5-bbfd-414b-8b1f-c814fcda5791-kube-api-access-6j7ll\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626369 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626405 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-config\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626425 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-config\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626463 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-client-ca\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.626752 master-0 kubenswrapper[6932]: I0319 11:53:27.626484 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728359 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j7ll\" (UniqueName: \"kubernetes.io/projected/12ec81c5-bbfd-414b-8b1f-c814fcda5791-kube-api-access-6j7ll\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728433 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728479 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-config\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728503 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-config\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728565 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-client-ca\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728602 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728646 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2bq5\" (UniqueName: \"kubernetes.io/projected/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-kube-api-access-z2bq5\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728675 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-client-ca\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.728758 master-0 kubenswrapper[6932]: I0319 11:53:27.728705 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-proxy-ca-bundles\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.732814 master-0 kubenswrapper[6932]: E0319 11:53:27.730671 6932 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:53:27.732814 master-0 kubenswrapper[6932]: E0319 11:53:27.730757 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert podName:12ec81c5-bbfd-414b-8b1f-c814fcda5791 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:28.230716191 +0000 UTC m=+32.589776403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert") pod "controller-manager-6d4875b77f-58xfr" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791") : secret "serving-cert" not found
Mar 19 11:53:27.732814 master-0 kubenswrapper[6932]: E0319 11:53:27.730910 6932 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:53:27.732814 master-0 kubenswrapper[6932]: E0319 11:53:27.731019 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert podName:abc21a83-e7d5-406f-a2b9-be189b0ef9a5 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:28.230990447 +0000 UTC m=+32.590050719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert") pod "route-controller-manager-9f85cf6f7-6kjz2" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5") : secret "serving-cert" not found
Mar 19 11:53:27.732814 master-0 kubenswrapper[6932]: I0319 11:53:27.732362 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-client-ca\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.732814 master-0 kubenswrapper[6932]: I0319 11:53:27.732680 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-proxy-ca-bundles\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.733161 master-0 kubenswrapper[6932]: I0319 11:53:27.732864 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-config\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.736809 master-0 kubenswrapper[6932]: I0319 11:53:27.734661 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-config\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.736809 master-0 kubenswrapper[6932]: I0319 11:53:27.735012 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-client-ca\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.773237 master-0 kubenswrapper[6932]: I0319 11:53:27.773173 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j7ll\" (UniqueName: \"kubernetes.io/projected/12ec81c5-bbfd-414b-8b1f-c814fcda5791-kube-api-access-6j7ll\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:27.780518 master-0 kubenswrapper[6932]: I0319 11:53:27.780475 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2bq5\" (UniqueName: \"kubernetes.io/projected/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-kube-api-access-z2bq5\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:27.875500 master-0 kubenswrapper[6932]: I0319 11:53:27.875434 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a7cb8c3-2773-4d00-95ef-9e292323d3f7" path="/var/lib/kubelet/pods/1a7cb8c3-2773-4d00-95ef-9e292323d3f7/volumes"
Mar 19 11:53:28.235237 master-0 kubenswrapper[6932]: I0319 11:53:28.235162 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:28.235469 master-0 kubenswrapper[6932]: I0319 11:53:28.235410 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:28.235556 master-0 kubenswrapper[6932]: E0319 11:53:28.235491 6932 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:53:28.235598 master-0 kubenswrapper[6932]: E0319 11:53:28.235585 6932 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:53:28.235649 master-0 kubenswrapper[6932]: E0319 11:53:28.235637 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert podName:abc21a83-e7d5-406f-a2b9-be189b0ef9a5 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.235601114 +0000 UTC m=+33.594661336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert") pod "route-controller-manager-9f85cf6f7-6kjz2" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5") : secret "serving-cert" not found
Mar 19 11:53:28.235706 master-0 kubenswrapper[6932]: E0319 11:53:28.235665 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert podName:12ec81c5-bbfd-414b-8b1f-c814fcda5791 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:29.235655845 +0000 UTC m=+33.594716067 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert") pod "controller-manager-6d4875b77f-58xfr" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791") : secret "serving-cert" not found
Mar 19 11:53:29.246189 master-0 kubenswrapper[6932]: I0319 11:53:29.246119 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"
Mar 19 11:53:29.246924 master-0 kubenswrapper[6932]: I0319 11:53:29.246242 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr"
Mar 19 11:53:29.246924 master-0 kubenswrapper[6932]: E0319 11:53:29.246386 6932 secret.go:189]
Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:53:29.246924 master-0 kubenswrapper[6932]: E0319 11:53:29.246431 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert podName:12ec81c5-bbfd-414b-8b1f-c814fcda5791 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:31.24641666 +0000 UTC m=+35.605476882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert") pod "controller-manager-6d4875b77f-58xfr" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791") : secret "serving-cert" not found Mar 19 11:53:29.246924 master-0 kubenswrapper[6932]: E0319 11:53:29.246481 6932 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:53:29.246924 master-0 kubenswrapper[6932]: E0319 11:53:29.246507 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert podName:abc21a83-e7d5-406f-a2b9-be189b0ef9a5 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:31.246500833 +0000 UTC m=+35.605561055 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert") pod "route-controller-manager-9f85cf6f7-6kjz2" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5") : secret "serving-cert" not found Mar 19 11:53:29.313774 master-0 kubenswrapper[6932]: I0319 11:53:29.313693 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-5848f5856-st8kj"] Mar 19 11:53:29.314432 master-0 kubenswrapper[6932]: I0319 11:53:29.314393 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.322261 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.322361 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.322824 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.322888 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.322953 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.322968 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.323510 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.323625 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 11:53:29.324230 master-0 kubenswrapper[6932]: I0319 11:53:29.323771 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 11:53:29.330096 master-0 kubenswrapper[6932]: I0319 11:53:29.330061 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 11:53:29.331805 master-0 kubenswrapper[6932]: I0319 11:53:29.331779 6932 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-5848f5856-st8kj"] Mar 19 11:53:29.468174 master-0 kubenswrapper[6932]: I0319 11:53:29.468114 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-audit-dir\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468174 master-0 kubenswrapper[6932]: I0319 11:53:29.468161 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468406 master-0 kubenswrapper[6932]: I0319 11:53:29.468207 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-node-pullsecrets\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468406 master-0 kubenswrapper[6932]: I0319 11:53:29.468226 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-image-import-ca\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468406 master-0 kubenswrapper[6932]: I0319 11:53:29.468254 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-encryption-config\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468406 master-0 kubenswrapper[6932]: I0319 11:53:29.468269 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwqd\" (UniqueName: \"kubernetes.io/projected/e8048701-c79a-4112-9a61-33bf9fb01a62-kube-api-access-8wwqd\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468406 master-0 kubenswrapper[6932]: I0319 11:53:29.468293 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-serving-ca\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468406 master-0 kubenswrapper[6932]: I0319 11:53:29.468329 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-config\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468406 master-0 kubenswrapper[6932]: I0319 11:53:29.468380 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468589 master-0 kubenswrapper[6932]: I0319 11:53:29.468421 6932 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-trusted-ca-bundle\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.468589 master-0 kubenswrapper[6932]: I0319 11:53:29.468437 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569254 master-0 kubenswrapper[6932]: I0319 11:53:29.569042 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569254 master-0 kubenswrapper[6932]: I0319 11:53:29.569081 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-audit-dir\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569254 master-0 kubenswrapper[6932]: I0319 11:53:29.569122 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-node-pullsecrets\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569254 master-0 
kubenswrapper[6932]: I0319 11:53:29.569166 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-image-import-ca\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569254 master-0 kubenswrapper[6932]: I0319 11:53:29.569216 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-encryption-config\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569254 master-0 kubenswrapper[6932]: I0319 11:53:29.569232 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwqd\" (UniqueName: \"kubernetes.io/projected/e8048701-c79a-4112-9a61-33bf9fb01a62-kube-api-access-8wwqd\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569254 master-0 kubenswrapper[6932]: E0319 11:53:29.569236 6932 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:53:29.569254 master-0 kubenswrapper[6932]: I0319 11:53:29.569255 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-node-pullsecrets\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569254 master-0 kubenswrapper[6932]: I0319 11:53:29.569171 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-audit-dir\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569753 master-0 kubenswrapper[6932]: E0319 11:53:29.569282 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:30.069269729 +0000 UTC m=+34.428329951 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : secret "serving-cert" not found Mar 19 11:53:29.569753 master-0 kubenswrapper[6932]: I0319 11:53:29.569338 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-serving-ca\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569753 master-0 kubenswrapper[6932]: I0319 11:53:29.569372 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-config\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.569753 master-0 kubenswrapper[6932]: I0319 11:53:29.569431 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " 
pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.570135 master-0 kubenswrapper[6932]: I0319 11:53:29.569967 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-trusted-ca-bundle\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.570135 master-0 kubenswrapper[6932]: I0319 11:53:29.569997 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.570135 master-0 kubenswrapper[6932]: I0319 11:53:29.570043 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-config\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.570135 master-0 kubenswrapper[6932]: E0319 11:53:29.570047 6932 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 11:53:29.570135 master-0 kubenswrapper[6932]: E0319 11:53:29.570089 6932 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:53:29.570135 master-0 kubenswrapper[6932]: E0319 11:53:29.570119 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:30.070108989 +0000 UTC m=+34.429169201 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : secret "etcd-client" not found Mar 19 11:53:29.570380 master-0 kubenswrapper[6932]: E0319 11:53:29.570150 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:30.070144069 +0000 UTC m=+34.429204291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : configmap "audit-0" not found Mar 19 11:53:29.570380 master-0 kubenswrapper[6932]: I0319 11:53:29.570194 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-image-import-ca\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.570688 master-0 kubenswrapper[6932]: I0319 11:53:29.570654 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-serving-ca\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.570834 master-0 kubenswrapper[6932]: I0319 11:53:29.570798 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-trusted-ca-bundle\") pod \"apiserver-5848f5856-st8kj\" (UID: 
\"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.574137 master-0 kubenswrapper[6932]: I0319 11:53:29.574096 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-encryption-config\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.587382 master-0 kubenswrapper[6932]: I0319 11:53:29.587317 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwqd\" (UniqueName: \"kubernetes.io/projected/e8048701-c79a-4112-9a61-33bf9fb01a62-kube-api-access-8wwqd\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:29.670904 master-0 kubenswrapper[6932]: I0319 11:53:29.670835 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:53:29.670904 master-0 kubenswrapper[6932]: I0319 11:53:29.670908 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:53:29.671273 master-0 kubenswrapper[6932]: I0319 11:53:29.670947 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:53:29.671273 master-0 kubenswrapper[6932]: I0319 11:53:29.670981 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:29.671273 master-0 kubenswrapper[6932]: I0319 11:53:29.671009 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:53:29.671273 master-0 kubenswrapper[6932]: I0319 11:53:29.671037 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:53:29.671273 master-0 kubenswrapper[6932]: I0319 11:53:29.671081 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " 
pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:53:29.671273 master-0 kubenswrapper[6932]: E0319 11:53:29.671180 6932 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:53:29.671273 master-0 kubenswrapper[6932]: E0319 11:53:29.671234 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs podName:f29b11ce-60e0-46b3-8d28-eea3452513cd nodeName:}" failed. No retries permitted until 2026-03-19 11:54:01.671217212 +0000 UTC m=+66.030277434 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs") pod "network-metrics-daemon-f6wv7" (UID: "f29b11ce-60e0-46b3-8d28-eea3452513cd") : secret "metrics-daemon-secret" not found Mar 19 11:53:29.671966 master-0 kubenswrapper[6932]: E0319 11:53:29.671622 6932 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:29.671966 master-0 kubenswrapper[6932]: E0319 11:53:29.671658 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls podName:681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:01.671647361 +0000 UTC m=+66.030707583 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-tkcwh" (UID: "681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:29.671966 master-0 kubenswrapper[6932]: E0319 11:53:29.671717 6932 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:53:29.671966 master-0 kubenswrapper[6932]: E0319 11:53:29.671769 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs podName:89cf2ee8-3664-4502-b70c-b7e0a5e92cb7 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:01.671759223 +0000 UTC m=+66.030819445 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-wdwkz" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7") : secret "multus-admission-controller-secret" not found Mar 19 11:53:29.674476 master-0 kubenswrapper[6932]: I0319 11:53:29.674162 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:29.674645 master-0 kubenswrapper[6932]: I0319 11:53:29.674569 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod 
\"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:53:29.674645 master-0 kubenswrapper[6932]: I0319 11:53:29.674652 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:53:29.674915 master-0 kubenswrapper[6932]: I0319 11:53:29.674829 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"cluster-version-operator-56d8475767-pk574\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:53:29.772735 master-0 kubenswrapper[6932]: I0319 11:53:29.772658 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:53:29.772999 master-0 kubenswrapper[6932]: I0319 11:53:29.772771 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:53:29.772999 master-0 kubenswrapper[6932]: I0319 11:53:29.772851 6932 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:53:29.773120 master-0 kubenswrapper[6932]: I0319 11:53:29.773043 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:53:29.773120 master-0 kubenswrapper[6932]: I0319 11:53:29.773089 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:29.773192 master-0 kubenswrapper[6932]: I0319 11:53:29.773159 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:53:29.773323 master-0 kubenswrapper[6932]: E0319 11:53:29.773156 6932 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:53:29.773323 master-0 kubenswrapper[6932]: E0319 
11:53:29.773305 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:53:29.773835 master-0 kubenswrapper[6932]: E0319 11:53:29.773155 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:53:29.773835 master-0 kubenswrapper[6932]: E0319 11:53:29.773183 6932 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:53:29.773835 master-0 kubenswrapper[6932]: E0319 11:53:29.773390 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics podName:b3de8a1b-a5be-414f-86e8-738e16c8bc97 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:01.773356848 +0000 UTC m=+66.132417070 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-bftt4" (UID: "b3de8a1b-a5be-414f-86e8-738e16c8bc97") : secret "marketplace-operator-metrics" not found Mar 19 11:53:29.773835 master-0 kubenswrapper[6932]: E0319 11:53:29.773773 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert podName:e5078f17-bc65-460f-9f18-8c506db6840b nodeName:}" failed. No retries permitted until 2026-03-19 11:54:01.773750167 +0000 UTC m=+66.132810389 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-jq5vq" (UID: "e5078f17-bc65-460f-9f18-8c506db6840b") : secret "package-server-manager-serving-cert" not found Mar 19 11:53:29.773835 master-0 kubenswrapper[6932]: E0319 11:53:29.773799 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert podName:716c2176-50f9-4c4f-af0e-4c7973457df2 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:01.773791718 +0000 UTC m=+66.132851940 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert") pod "olm-operator-5c9796789-l9sw9" (UID: "716c2176-50f9-4c4f-af0e-4c7973457df2") : secret "olm-operator-serving-cert" not found Mar 19 11:53:29.773835 master-0 kubenswrapper[6932]: E0319 11:53:29.773813 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert podName:cf08ab4f-c203-4c16-9826-8cc049f4af31 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:01.773806318 +0000 UTC m=+66.132866540 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert") pod "catalog-operator-68f85b4d6c-n5gr9" (UID: "cf08ab4f-c203-4c16-9826-8cc049f4af31") : secret "catalog-operator-serving-cert" not found Mar 19 11:53:29.782171 master-0 kubenswrapper[6932]: I0319 11:53:29.776416 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:53:29.782171 master-0 kubenswrapper[6932]: I0319 11:53:29.776807 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:29.828872 master-0 kubenswrapper[6932]: I0319 11:53:29.828750 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:53:29.834639 master-0 kubenswrapper[6932]: I0319 11:53:29.829834 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:53:29.834639 master-0 kubenswrapper[6932]: I0319 11:53:29.830192 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:53:29.834639 master-0 kubenswrapper[6932]: I0319 11:53:29.832209 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:53:29.834639 master-0 kubenswrapper[6932]: I0319 11:53:29.832397 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:53:30.080066 master-0 kubenswrapper[6932]: I0319 11:53:30.079703 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:30.080066 master-0 kubenswrapper[6932]: I0319 11:53:30.080062 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:30.080259 master-0 kubenswrapper[6932]: I0319 11:53:30.080103 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:30.080259 master-0 kubenswrapper[6932]: E0319 11:53:30.080226 6932 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:53:30.080325 master-0 kubenswrapper[6932]: E0319 11:53:30.080270 6932 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 11:53:30.080325 master-0 kubenswrapper[6932]: E0319 11:53:30.080282 6932 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:31.080265128 +0000 UTC m=+35.439325350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : secret "etcd-client" not found Mar 19 11:53:30.080325 master-0 kubenswrapper[6932]: E0319 11:53:30.080324 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:31.08031268 +0000 UTC m=+35.439372912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : configmap "audit-0" not found Mar 19 11:53:30.080420 master-0 kubenswrapper[6932]: E0319 11:53:30.080336 6932 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:53:30.080447 master-0 kubenswrapper[6932]: E0319 11:53:30.080439 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:31.080418392 +0000 UTC m=+35.439478654 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : secret "serving-cert" not found Mar 19 11:53:30.109974 master-0 kubenswrapper[6932]: I0319 11:53:30.108115 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"] Mar 19 11:53:30.132846 master-0 kubenswrapper[6932]: I0319 11:53:30.132767 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" event={"ID":"aaaaf539-bf61-44d7-8d47-97535b7aa1ba","Type":"ContainerStarted","Data":"4f25a976585d22d9ce3955473a200e96837f45c766e321488b3d87050f023b7a"} Mar 19 11:53:30.142965 master-0 kubenswrapper[6932]: I0319 11:53:30.142835 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" event={"ID":"333047c4-aeca-410e-9393-ca4e74366921","Type":"ContainerStarted","Data":"51659f06b28a4c4f2cd28005c52835b309a9cf7c78a54c2ff2f7be93e57a3eb3"} Mar 19 11:53:30.270348 master-0 kubenswrapper[6932]: I0319 11:53:30.270285 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"] Mar 19 11:53:30.277309 master-0 kubenswrapper[6932]: W0319 11:53:30.276393 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod163d6a3d_0080_4122_bb7a_17f6e63f00f0.slice/crio-8f6241ff25db322ca912b366aec02ce24e776e994e5454c053b2a00c5bd1a93b WatchSource:0}: Error finding container 8f6241ff25db322ca912b366aec02ce24e776e994e5454c053b2a00c5bd1a93b: Status 404 returned error can't find the container with id 8f6241ff25db322ca912b366aec02ce24e776e994e5454c053b2a00c5bd1a93b Mar 19 11:53:30.331588 master-0 
kubenswrapper[6932]: I0319 11:53:30.331455 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-965np"] Mar 19 11:53:30.339240 master-0 kubenswrapper[6932]: W0319 11:53:30.338130 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22e10648_af7c_409e_b947_570e7d807e05.slice/crio-bca8253525b3cd943116e55714fdf37c6331867834b278964c5e6f5dd4c53fef WatchSource:0}: Error finding container bca8253525b3cd943116e55714fdf37c6331867834b278964c5e6f5dd4c53fef: Status 404 returned error can't find the container with id bca8253525b3cd943116e55714fdf37c6331867834b278964c5e6f5dd4c53fef Mar 19 11:53:30.339240 master-0 kubenswrapper[6932]: I0319 11:53:30.338217 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"] Mar 19 11:53:30.790458 master-0 kubenswrapper[6932]: I0319 11:53:30.788050 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-9ltgx"] Mar 19 11:53:30.790955 master-0 kubenswrapper[6932]: I0319 11:53:30.790903 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.791886 master-0 kubenswrapper[6932]: I0319 11:53:30.791628 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-dir\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.791886 master-0 kubenswrapper[6932]: I0319 11:53:30.791684 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt4fx\" (UniqueName: \"kubernetes.io/projected/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-kube-api-access-dt4fx\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.792482 master-0 kubenswrapper[6932]: I0319 11:53:30.792445 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 11:53:30.794924 master-0 kubenswrapper[6932]: I0319 11:53:30.793256 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-trusted-ca-bundle\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.794924 master-0 kubenswrapper[6932]: I0319 11:53:30.793923 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 11:53:30.794924 master-0 kubenswrapper[6932]: I0319 11:53:30.794226 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.794924 master-0 kubenswrapper[6932]: I0319 11:53:30.794296 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-encryption-config\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.794924 master-0 kubenswrapper[6932]: I0319 11:53:30.794384 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-serving-ca\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.794924 master-0 kubenswrapper[6932]: I0319 11:53:30.794530 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-policies\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.804599 master-0 kubenswrapper[6932]: I0319 11:53:30.795924 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.804599 master-0 kubenswrapper[6932]: 
I0319 11:53:30.800231 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-9ltgx"] Mar 19 11:53:30.804599 master-0 kubenswrapper[6932]: I0319 11:53:30.800412 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 11:53:30.804599 master-0 kubenswrapper[6932]: I0319 11:53:30.800547 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 11:53:30.804599 master-0 kubenswrapper[6932]: I0319 11:53:30.800602 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 11:53:30.804599 master-0 kubenswrapper[6932]: I0319 11:53:30.801064 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 11:53:30.804599 master-0 kubenswrapper[6932]: I0319 11:53:30.801273 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 11:53:30.804599 master-0 kubenswrapper[6932]: I0319 11:53:30.802220 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 11:53:30.897116 master-0 kubenswrapper[6932]: I0319 11:53:30.897021 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-serving-ca\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.897785 master-0 kubenswrapper[6932]: I0319 11:53:30.897120 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-policies\") pod \"apiserver-5547669f67-9ltgx\" (UID: 
\"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.897785 master-0 kubenswrapper[6932]: I0319 11:53:30.897192 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.897785 master-0 kubenswrapper[6932]: I0319 11:53:30.897274 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-dir\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.897785 master-0 kubenswrapper[6932]: I0319 11:53:30.897310 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt4fx\" (UniqueName: \"kubernetes.io/projected/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-kube-api-access-dt4fx\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.897785 master-0 kubenswrapper[6932]: I0319 11:53:30.897346 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-trusted-ca-bundle\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.897785 master-0 kubenswrapper[6932]: I0319 11:53:30.897392 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-encryption-config\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.897785 master-0 kubenswrapper[6932]: I0319 11:53:30.897410 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.898110 master-0 kubenswrapper[6932]: I0319 11:53:30.897944 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-dir\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.898363 master-0 kubenswrapper[6932]: I0319 11:53:30.898339 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-serving-ca\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.898511 master-0 kubenswrapper[6932]: E0319 11:53:30.898448 6932 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:53:30.899590 master-0 kubenswrapper[6932]: E0319 11:53:30.898577 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert podName:69a2593c-e0f5-4e0b-9406-a96a3802c7cb nodeName:}" failed. 
No retries permitted until 2026-03-19 11:53:31.398477885 +0000 UTC m=+35.757538107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert") pod "apiserver-5547669f67-9ltgx" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb") : secret "serving-cert" not found Mar 19 11:53:30.899590 master-0 kubenswrapper[6932]: E0319 11:53:30.898586 6932 secret.go:189] Couldn't get secret openshift-oauth-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:53:30.899590 master-0 kubenswrapper[6932]: E0319 11:53:30.898768 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client podName:69a2593c-e0f5-4e0b-9406-a96a3802c7cb nodeName:}" failed. No retries permitted until 2026-03-19 11:53:31.398710151 +0000 UTC m=+35.757770373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client") pod "apiserver-5547669f67-9ltgx" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb") : secret "etcd-client" not found Mar 19 11:53:30.899590 master-0 kubenswrapper[6932]: I0319 11:53:30.898896 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-trusted-ca-bundle\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.899590 master-0 kubenswrapper[6932]: I0319 11:53:30.899407 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-policies\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " 
pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.925763 master-0 kubenswrapper[6932]: I0319 11:53:30.917938 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-encryption-config\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:30.925763 master-0 kubenswrapper[6932]: I0319 11:53:30.918996 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt4fx\" (UniqueName: \"kubernetes.io/projected/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-kube-api-access-dt4fx\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:31.103587 master-0 kubenswrapper[6932]: I0319 11:53:31.099930 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:31.103587 master-0 kubenswrapper[6932]: I0319 11:53:31.100021 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:31.103587 master-0 kubenswrapper[6932]: E0319 11:53:31.100217 6932 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 11:53:31.103587 master-0 kubenswrapper[6932]: I0319 11:53:31.100329 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:31.103587 master-0 kubenswrapper[6932]: E0319 11:53:31.100389 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:33.100364285 +0000 UTC m=+37.459424527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : configmap "audit-0" not found Mar 19 11:53:31.103587 master-0 kubenswrapper[6932]: E0319 11:53:31.100483 6932 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:53:31.103587 master-0 kubenswrapper[6932]: E0319 11:53:31.100555 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:33.100534789 +0000 UTC m=+37.459595011 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : secret "serving-cert" not found Mar 19 11:53:31.104611 master-0 kubenswrapper[6932]: I0319 11:53:31.104359 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:31.153719 master-0 kubenswrapper[6932]: I0319 11:53:31.153659 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" event={"ID":"163d6a3d-0080-4122-bb7a-17f6e63f00f0","Type":"ContainerStarted","Data":"8f6241ff25db322ca912b366aec02ce24e776e994e5454c053b2a00c5bd1a93b"} Mar 19 11:53:31.155448 master-0 kubenswrapper[6932]: I0319 11:53:31.155333 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" event={"ID":"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1","Type":"ContainerStarted","Data":"6502c99aaf4d4f945a08ddd70ddf47028a9961291a598bc4054d9498e0e3049e"} Mar 19 11:53:31.157351 master-0 kubenswrapper[6932]: I0319 11:53:31.157306 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" event={"ID":"22e10648-af7c-409e-b947-570e7d807e05","Type":"ContainerStarted","Data":"bca8253525b3cd943116e55714fdf37c6331867834b278964c5e6f5dd4c53fef"} Mar 19 11:53:31.302860 master-0 kubenswrapper[6932]: I0319 11:53:31.302799 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod 
\"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr" Mar 19 11:53:31.303442 master-0 kubenswrapper[6932]: I0319 11:53:31.302928 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2" Mar 19 11:53:31.303442 master-0 kubenswrapper[6932]: E0319 11:53:31.303102 6932 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:53:31.303442 master-0 kubenswrapper[6932]: E0319 11:53:31.303150 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert podName:12ec81c5-bbfd-414b-8b1f-c814fcda5791 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:35.303135513 +0000 UTC m=+39.662195735 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert") pod "controller-manager-6d4875b77f-58xfr" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791") : secret "serving-cert" not found Mar 19 11:53:31.304212 master-0 kubenswrapper[6932]: E0319 11:53:31.304186 6932 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:53:31.304257 master-0 kubenswrapper[6932]: E0319 11:53:31.304224 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert podName:abc21a83-e7d5-406f-a2b9-be189b0ef9a5 nodeName:}" failed. 
No retries permitted until 2026-03-19 11:53:35.304213938 +0000 UTC m=+39.663274160 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert") pod "route-controller-manager-9f85cf6f7-6kjz2" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5") : secret "serving-cert" not found Mar 19 11:53:31.404415 master-0 kubenswrapper[6932]: I0319 11:53:31.404276 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:31.404415 master-0 kubenswrapper[6932]: I0319 11:53:31.404365 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:31.404767 master-0 kubenswrapper[6932]: E0319 11:53:31.404747 6932 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:53:31.404831 master-0 kubenswrapper[6932]: E0319 11:53:31.404826 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert podName:69a2593c-e0f5-4e0b-9406-a96a3802c7cb nodeName:}" failed. No retries permitted until 2026-03-19 11:53:32.40481039 +0000 UTC m=+36.763870662 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert") pod "apiserver-5547669f67-9ltgx" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb") : secret "serving-cert" not found Mar 19 11:53:31.410299 master-0 kubenswrapper[6932]: I0319 11:53:31.409632 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:32.399024 master-0 kubenswrapper[6932]: I0319 11:53:32.398973 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm"] Mar 19 11:53:32.399902 master-0 kubenswrapper[6932]: I0319 11:53:32.399803 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.411685 master-0 kubenswrapper[6932]: I0319 11:53:32.402944 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 19 11:53:32.411685 master-0 kubenswrapper[6932]: I0319 11:53:32.403427 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 19 11:53:32.415209 master-0 kubenswrapper[6932]: I0319 11:53:32.412981 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm"] Mar 19 11:53:32.423215 master-0 kubenswrapper[6932]: I0319 11:53:32.421574 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 19 11:53:32.460692 master-0 
kubenswrapper[6932]: I0319 11:53:32.460265 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:32.460692 master-0 kubenswrapper[6932]: E0319 11:53:32.460502 6932 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:53:32.460692 master-0 kubenswrapper[6932]: E0319 11:53:32.460552 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert podName:69a2593c-e0f5-4e0b-9406-a96a3802c7cb nodeName:}" failed. No retries permitted until 2026-03-19 11:53:34.460538387 +0000 UTC m=+38.819598609 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert") pod "apiserver-5547669f67-9ltgx" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb") : secret "serving-cert" not found Mar 19 11:53:32.494404 master-0 kubenswrapper[6932]: I0319 11:53:32.494353 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"] Mar 19 11:53:32.495462 master-0 kubenswrapper[6932]: I0319 11:53:32.495441 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.508869 master-0 kubenswrapper[6932]: I0319 11:53:32.507784 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 19 11:53:32.508869 master-0 kubenswrapper[6932]: I0319 11:53:32.507895 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 11:53:32.509396 master-0 kubenswrapper[6932]: I0319 11:53:32.509363 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 11:53:32.526101 master-0 kubenswrapper[6932]: I0319 11:53:32.524626 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"] Mar 19 11:53:32.526101 master-0 kubenswrapper[6932]: I0319 11:53:32.525451 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 11:53:32.561534 master-0 kubenswrapper[6932]: I0319 11:53:32.561466 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.561534 master-0 kubenswrapper[6932]: I0319 11:53:32.561535 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.561829 master-0 kubenswrapper[6932]: I0319 11:53:32.561592 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.561829 master-0 kubenswrapper[6932]: I0319 11:53:32.561663 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srlcl\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-kube-api-access-srlcl\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.561914 master-0 kubenswrapper[6932]: I0319 11:53:32.561848 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b94d1eb-1b80-4a14-b1c0-d9e192231352-cache\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.663949 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.664176 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/376b18a9-5f33-44fd-a37b-20ab02c5e65d-cache\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.664238 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srlcl\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-kube-api-access-srlcl\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.664264 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.664287 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b94d1eb-1b80-4a14-b1c0-d9e192231352-cache\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: 
I0319 11:53:32.664307 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2hrw\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-kube-api-access-f2hrw\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.664979 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.665037 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.665059 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.665088 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.665739 master-0 kubenswrapper[6932]: I0319 11:53:32.665660 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.666344 master-0 kubenswrapper[6932]: I0319 11:53:32.666015 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.666344 master-0 kubenswrapper[6932]: I0319 11:53:32.666067 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.666965 master-0 kubenswrapper[6932]: I0319 11:53:32.666936 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b94d1eb-1b80-4a14-b1c0-d9e192231352-cache\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: 
\"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.673660 master-0 kubenswrapper[6932]: I0319 11:53:32.673599 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.687157 master-0 kubenswrapper[6932]: I0319 11:53:32.687070 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srlcl\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-kube-api-access-srlcl\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.730771 master-0 kubenswrapper[6932]: I0319 11:53:32.727490 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:32.767953 master-0 kubenswrapper[6932]: I0319 11:53:32.767313 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.767953 master-0 kubenswrapper[6932]: I0319 11:53:32.767383 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2hrw\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-kube-api-access-f2hrw\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.767953 master-0 kubenswrapper[6932]: I0319 11:53:32.767406 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.767953 master-0 kubenswrapper[6932]: I0319 11:53:32.767452 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.767953 master-0 kubenswrapper[6932]: I0319 11:53:32.767484 6932 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.767953 master-0 kubenswrapper[6932]: I0319 11:53:32.767582 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/376b18a9-5f33-44fd-a37b-20ab02c5e65d-cache\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.767953 master-0 kubenswrapper[6932]: I0319 11:53:32.767689 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.768533 master-0 kubenswrapper[6932]: E0319 11:53:32.768216 6932 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 19 11:53:32.768533 master-0 kubenswrapper[6932]: E0319 11:53:32.768346 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs podName:376b18a9-5f33-44fd-a37b-20ab02c5e65d nodeName:}" failed. No retries permitted until 2026-03-19 11:53:33.268317457 +0000 UTC m=+37.627377679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs") pod "catalogd-controller-manager-6864dc98f7-xzxpq" (UID: "376b18a9-5f33-44fd-a37b-20ab02c5e65d") : secret "catalogserver-cert" not found Mar 19 11:53:32.768533 master-0 kubenswrapper[6932]: I0319 11:53:32.767910 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.769077 master-0 kubenswrapper[6932]: I0319 11:53:32.768857 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/376b18a9-5f33-44fd-a37b-20ab02c5e65d-cache\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.776008 master-0 kubenswrapper[6932]: I0319 11:53:32.775802 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:32.789090 master-0 kubenswrapper[6932]: I0319 11:53:32.788887 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hrw\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-kube-api-access-f2hrw\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " 
pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:33.171075 master-0 kubenswrapper[6932]: I0319 11:53:33.170697 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:33.171536 master-0 kubenswrapper[6932]: E0319 11:53:33.171017 6932 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:53:33.171536 master-0 kubenswrapper[6932]: E0319 11:53:33.171264 6932 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 11:53:33.171536 master-0 kubenswrapper[6932]: E0319 11:53:33.171265 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:37.171234796 +0000 UTC m=+41.530295018 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : secret "serving-cert" not found Mar 19 11:53:33.171536 master-0 kubenswrapper[6932]: E0319 11:53:33.171342 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:37.171324278 +0000 UTC m=+41.530384500 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : configmap "audit-0" not found Mar 19 11:53:33.171536 master-0 kubenswrapper[6932]: I0319 11:53:33.171159 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:33.273568 master-0 kubenswrapper[6932]: I0319 11:53:33.273193 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:33.273568 master-0 kubenswrapper[6932]: E0319 11:53:33.273455 6932 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 19 11:53:33.273568 master-0 kubenswrapper[6932]: E0319 11:53:33.273502 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs podName:376b18a9-5f33-44fd-a37b-20ab02c5e65d nodeName:}" failed. No retries permitted until 2026-03-19 11:53:34.273488135 +0000 UTC m=+38.632548357 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs") pod "catalogd-controller-manager-6864dc98f7-xzxpq" (UID: "376b18a9-5f33-44fd-a37b-20ab02c5e65d") : secret "catalogserver-cert" not found Mar 19 11:53:34.335222 master-0 kubenswrapper[6932]: I0319 11:53:34.335161 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:34.335222 master-0 kubenswrapper[6932]: E0319 11:53:34.335283 6932 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 19 11:53:34.336614 master-0 kubenswrapper[6932]: E0319 11:53:34.335358 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs podName:376b18a9-5f33-44fd-a37b-20ab02c5e65d nodeName:}" failed. No retries permitted until 2026-03-19 11:53:36.33533956 +0000 UTC m=+40.694399782 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs") pod "catalogd-controller-manager-6864dc98f7-xzxpq" (UID: "376b18a9-5f33-44fd-a37b-20ab02c5e65d") : secret "catalogserver-cert" not found Mar 19 11:53:34.537181 master-0 kubenswrapper[6932]: I0319 11:53:34.537109 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:34.574107 master-0 kubenswrapper[6932]: I0319 11:53:34.561846 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert\") pod \"apiserver-5547669f67-9ltgx\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:34.737278 master-0 kubenswrapper[6932]: I0319 11:53:34.737226 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:34.879514 master-0 kubenswrapper[6932]: I0319 11:53:34.879433 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:53:34.881636 master-0 kubenswrapper[6932]: I0319 11:53:34.881609 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm"] Mar 19 11:53:34.881766 master-0 kubenswrapper[6932]: I0319 11:53:34.881743 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:34.888137 master-0 kubenswrapper[6932]: I0319 11:53:34.888077 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 11:53:34.942440 master-0 kubenswrapper[6932]: I0319 11:53:34.942096 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-var-lock\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:34.942440 master-0 kubenswrapper[6932]: I0319 11:53:34.942442 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:34.942787 master-0 kubenswrapper[6932]: I0319 11:53:34.942521 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: I0319 11:53:35.439591 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: I0319 11:53:35.439788 6932 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2" Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: I0319 11:53:35.439827 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-var-lock\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: I0319 11:53:35.439865 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: I0319 11:53:35.439947 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr" Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: E0319 11:53:35.440089 6932 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: E0319 11:53:35.440148 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert 
podName:12ec81c5-bbfd-414b-8b1f-c814fcda5791 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:43.44013176 +0000 UTC m=+47.799191982 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert") pod "controller-manager-6d4875b77f-58xfr" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791") : secret "serving-cert" not found Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: I0319 11:53:35.440191 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: I0319 11:53:35.440312 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-var-lock\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: E0319 11:53:35.440510 6932 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:53:35.443516 master-0 kubenswrapper[6932]: E0319 11:53:35.440579 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert podName:abc21a83-e7d5-406f-a2b9-be189b0ef9a5 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:43.440554419 +0000 UTC m=+47.799614741 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert") pod "route-controller-manager-9f85cf6f7-6kjz2" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5") : secret "serving-cert" not found Mar 19 11:53:35.708568 master-0 kubenswrapper[6932]: I0319 11:53:35.705189 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:53:36.352635 master-0 kubenswrapper[6932]: I0319 11:53:36.352553 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:36.352898 master-0 kubenswrapper[6932]: E0319 11:53:36.352738 6932 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 19 11:53:36.352898 master-0 kubenswrapper[6932]: E0319 11:53:36.352793 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs podName:376b18a9-5f33-44fd-a37b-20ab02c5e65d nodeName:}" failed. No retries permitted until 2026-03-19 11:53:40.352774119 +0000 UTC m=+44.711834341 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs") pod "catalogd-controller-manager-6864dc98f7-xzxpq" (UID: "376b18a9-5f33-44fd-a37b-20ab02c5e65d") : secret "catalogserver-cert" not found Mar 19 11:53:36.409380 master-0 kubenswrapper[6932]: W0319 11:53:36.409324 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b94d1eb_1b80_4a14_b1c0_d9e192231352.slice/crio-45a1f521d794b1ca367dab762c38f2fc2e98e9ba7d75ffaddb7fceef49fff20d WatchSource:0}: Error finding container 45a1f521d794b1ca367dab762c38f2fc2e98e9ba7d75ffaddb7fceef49fff20d: Status 404 returned error can't find the container with id 45a1f521d794b1ca367dab762c38f2fc2e98e9ba7d75ffaddb7fceef49fff20d Mar 19 11:53:36.415035 master-0 kubenswrapper[6932]: I0319 11:53:36.414991 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5848f5856-st8kj"] Mar 19 11:53:36.415666 master-0 kubenswrapper[6932]: E0319 11:53:36.415636 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-5848f5856-st8kj" podUID="e8048701-c79a-4112-9a61-33bf9fb01a62" Mar 19 11:53:36.459757 master-0 kubenswrapper[6932]: I0319 11:53:36.458498 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:36.494750 master-0 kubenswrapper[6932]: I0319 11:53:36.493296 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:36.494750 master-0 kubenswrapper[6932]: I0319 11:53:36.493804 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" event={"ID":"1b94d1eb-1b80-4a14-b1c0-d9e192231352","Type":"ContainerStarted","Data":"45a1f521d794b1ca367dab762c38f2fc2e98e9ba7d75ffaddb7fceef49fff20d"} Mar 19 11:53:36.530781 master-0 kubenswrapper[6932]: I0319 11:53:36.529804 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569080 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-config\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569168 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-audit-dir\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569192 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-node-pullsecrets\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569226 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-image-import-ca\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569269 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-trusted-ca-bundle\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569298 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wwqd\" (UniqueName: \"kubernetes.io/projected/e8048701-c79a-4112-9a61-33bf9fb01a62-kube-api-access-8wwqd\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569324 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569343 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-encryption-config\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.569365 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-serving-ca\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: 
\"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:36.570753 master-0 kubenswrapper[6932]: I0319 11:53:36.570207 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:36.571662 master-0 kubenswrapper[6932]: I0319 11:53:36.571313 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:53:36.571662 master-0 kubenswrapper[6932]: I0319 11:53:36.571397 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:53:36.575585 master-0 kubenswrapper[6932]: I0319 11:53:36.572076 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-config" (OuterVolumeSpecName: "config") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:36.575841 master-0 kubenswrapper[6932]: I0319 11:53:36.575783 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:36.578791 master-0 kubenswrapper[6932]: I0319 11:53:36.576103 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:36.578791 master-0 kubenswrapper[6932]: I0319 11:53:36.577643 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:36.580991 master-0 kubenswrapper[6932]: I0319 11:53:36.579414 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:36.600553 master-0 kubenswrapper[6932]: I0319 11:53:36.600494 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8048701-c79a-4112-9a61-33bf9fb01a62-kube-api-access-8wwqd" (OuterVolumeSpecName: "kube-api-access-8wwqd") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "kube-api-access-8wwqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: I0319 11:53:36.670317 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: I0319 11:53:36.670354 6932 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: I0319 11:53:36.670366 6932 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8048701-c79a-4112-9a61-33bf9fb01a62-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: I0319 11:53:36.670376 6932 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: I0319 11:53:36.670386 6932 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: 
I0319 11:53:36.670395 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wwqd\" (UniqueName: \"kubernetes.io/projected/e8048701-c79a-4112-9a61-33bf9fb01a62-kube-api-access-8wwqd\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: I0319 11:53:36.670405 6932 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: I0319 11:53:36.670413 6932 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.671827 master-0 kubenswrapper[6932]: I0319 11:53:36.670422 6932 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:36.718755 master-0 kubenswrapper[6932]: I0319 11:53:36.718444 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:53:37.177448 master-0 kubenswrapper[6932]: I0319 11:53:37.177397 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:37.177783 master-0 kubenswrapper[6932]: E0319 11:53:37.177590 6932 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 11:53:37.177783 master-0 kubenswrapper[6932]: I0319 11:53:37.177665 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:37.177783 master-0 kubenswrapper[6932]: E0319 11:53:37.177674 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit podName:e8048701-c79a-4112-9a61-33bf9fb01a62 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:45.177658166 +0000 UTC m=+49.536718388 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit") pod "apiserver-5848f5856-st8kj" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62") : configmap "audit-0" not found Mar 19 11:53:37.180688 master-0 kubenswrapper[6932]: I0319 11:53:37.180643 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") pod \"apiserver-5848f5856-st8kj\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:37.278795 master-0 kubenswrapper[6932]: I0319 11:53:37.278722 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") pod \"e8048701-c79a-4112-9a61-33bf9fb01a62\" (UID: \"e8048701-c79a-4112-9a61-33bf9fb01a62\") " Mar 19 11:53:37.281367 master-0 kubenswrapper[6932]: I0319 11:53:37.281290 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e8048701-c79a-4112-9a61-33bf9fb01a62" (UID: "e8048701-c79a-4112-9a61-33bf9fb01a62"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:37.380422 master-0 kubenswrapper[6932]: I0319 11:53:37.380367 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8048701-c79a-4112-9a61-33bf9fb01a62-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:37.496704 master-0 kubenswrapper[6932]: I0319 11:53:37.496655 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5848f5856-st8kj" Mar 19 11:53:37.567664 master-0 kubenswrapper[6932]: I0319 11:53:37.567586 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-f67f6868b-chx8j"] Mar 19 11:53:37.569507 master-0 kubenswrapper[6932]: I0319 11:53:37.569483 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.573151 master-0 kubenswrapper[6932]: I0319 11:53:37.573121 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5848f5856-st8kj"] Mar 19 11:53:37.584869 master-0 kubenswrapper[6932]: I0319 11:53:37.579430 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 11:53:37.584869 master-0 kubenswrapper[6932]: I0319 11:53:37.579517 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 11:53:37.584869 master-0 kubenswrapper[6932]: I0319 11:53:37.579766 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 11:53:37.584869 master-0 kubenswrapper[6932]: I0319 11:53:37.579952 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 11:53:37.584869 master-0 kubenswrapper[6932]: I0319 11:53:37.580070 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 11:53:37.584869 master-0 kubenswrapper[6932]: I0319 11:53:37.580158 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 11:53:37.584869 master-0 kubenswrapper[6932]: I0319 11:53:37.580808 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 11:53:37.584869 master-0 kubenswrapper[6932]: I0319 11:53:37.581221 
6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 11:53:37.587190 master-0 kubenswrapper[6932]: I0319 11:53:37.586981 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.588580 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-node-pullsecrets\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.588641 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-serving-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.588669 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-encryption-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.588701 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-audit\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 
master-0 kubenswrapper[6932]: I0319 11:53:37.588720 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.588765 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-serving-cert\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.588872 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-audit-dir\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.589112 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88ghj\" (UniqueName: \"kubernetes.io/projected/e48b5aa9-293e-4222-91ff-7640addeca4c-kube-api-access-88ghj\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.589196 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-client\") pod \"apiserver-f67f6868b-chx8j\" (UID: 
\"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.589217 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-image-import-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.591715 master-0 kubenswrapper[6932]: I0319 11:53:37.589274 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-trusted-ca-bundle\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.595093 master-0 kubenswrapper[6932]: I0319 11:53:37.595020 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-5848f5856-st8kj"] Mar 19 11:53:37.595356 master-0 kubenswrapper[6932]: I0319 11:53:37.595291 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 11:53:37.595587 master-0 kubenswrapper[6932]: I0319 11:53:37.595558 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-f67f6868b-chx8j"] Mar 19 11:53:37.613781 master-0 kubenswrapper[6932]: I0319 11:53:37.612497 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-9ltgx"] Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694376 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-node-pullsecrets\") pod 
\"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694430 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-encryption-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694452 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-serving-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694487 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-audit\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694510 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694544 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-serving-cert\") pod 
\"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694580 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-audit-dir\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694616 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88ghj\" (UniqueName: \"kubernetes.io/projected/e48b5aa9-293e-4222-91ff-7640addeca4c-kube-api-access-88ghj\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694641 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-client\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694667 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-image-import-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694703 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-trusted-ca-bundle\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.695830 master-0 kubenswrapper[6932]: I0319 11:53:37.694767 6932 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8048701-c79a-4112-9a61-33bf9fb01a62-audit\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:37.696465 master-0 kubenswrapper[6932]: I0319 11:53:37.696072 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-trusted-ca-bundle\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.696465 master-0 kubenswrapper[6932]: I0319 11:53:37.696137 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-node-pullsecrets\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.706775 master-0 kubenswrapper[6932]: I0319 11:53:37.705373 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-serving-cert\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.706775 master-0 kubenswrapper[6932]: I0319 11:53:37.706055 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-serving-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: 
\"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.706775 master-0 kubenswrapper[6932]: I0319 11:53:37.706521 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-audit\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.707052 master-0 kubenswrapper[6932]: I0319 11:53:37.707000 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.707554 master-0 kubenswrapper[6932]: I0319 11:53:37.707394 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-audit-dir\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.707554 master-0 kubenswrapper[6932]: I0319 11:53:37.707527 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-encryption-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.709279 master-0 kubenswrapper[6932]: I0319 11:53:37.708037 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-image-import-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " 
pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.712239 master-0 kubenswrapper[6932]: I0319 11:53:37.712202 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-client\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.743864 master-0 kubenswrapper[6932]: I0319 11:53:37.743542 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88ghj\" (UniqueName: \"kubernetes.io/projected/e48b5aa9-293e-4222-91ff-7640addeca4c-kube-api-access-88ghj\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:37.883798 master-0 kubenswrapper[6932]: I0319 11:53:37.883460 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8048701-c79a-4112-9a61-33bf9fb01a62" path="/var/lib/kubelet/pods/e8048701-c79a-4112-9a61-33bf9fb01a62/volumes" Mar 19 11:53:37.908621 master-0 kubenswrapper[6932]: I0319 11:53:37.908565 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:40.354940 master-0 kubenswrapper[6932]: I0319 11:53:40.354238 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:40.362827 master-0 kubenswrapper[6932]: I0319 11:53:40.362763 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:40.633563 master-0 kubenswrapper[6932]: I0319 11:53:40.633382 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:41.438630 master-0 kubenswrapper[6932]: I0319 11:53:41.414833 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 11:53:41.438630 master-0 kubenswrapper[6932]: I0319 11:53:41.415892 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:41.438630 master-0 kubenswrapper[6932]: I0319 11:53:41.419188 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 19 11:53:41.438630 master-0 kubenswrapper[6932]: I0319 11:53:41.432492 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 11:53:41.488107 master-0 kubenswrapper[6932]: I0319 11:53:41.480605 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-var-lock\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:41.488107 master-0 kubenswrapper[6932]: I0319 11:53:41.480716 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:41.488107 master-0 kubenswrapper[6932]: I0319 11:53:41.480763 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bde080b-3820-463f-a27d-9fb9a7843d5d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:41.514657 master-0 kubenswrapper[6932]: I0319 11:53:41.512522 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:53:41.530028 master-0 kubenswrapper[6932]: I0319 11:53:41.529562 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" event={"ID":"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1","Type":"ContainerStarted","Data":"9f1ab97c85874a07ea21622a9c342afce5ca479bac8abbfa4f10b05fc521b17b"} Mar 19 11:53:41.584575 master-0 kubenswrapper[6932]: I0319 11:53:41.583502 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-var-lock\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:41.584575 master-0 kubenswrapper[6932]: I0319 11:53:41.584128 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:41.584575 master-0 kubenswrapper[6932]: I0319 11:53:41.584161 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bde080b-3820-463f-a27d-9fb9a7843d5d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:41.584575 master-0 kubenswrapper[6932]: I0319 11:53:41.584651 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-var-lock\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:41.586770 master-0 kubenswrapper[6932]: I0319 11:53:41.585274 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:42.200867 master-0 kubenswrapper[6932]: I0319 11:53:42.200612 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d4875b77f-58xfr"] Mar 19 11:53:42.201075 master-0 kubenswrapper[6932]: E0319 11:53:42.201000 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr" podUID="12ec81c5-bbfd-414b-8b1f-c814fcda5791" Mar 19 11:53:42.203291 master-0 kubenswrapper[6932]: I0319 11:53:42.202679 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-f67f6868b-chx8j"] Mar 19 11:53:42.205409 master-0 kubenswrapper[6932]: I0319 11:53:42.205132 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:53:42.210147 master-0 kubenswrapper[6932]: I0319 11:53:42.208025 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"] Mar 19 11:53:42.223225 master-0 kubenswrapper[6932]: I0319 11:53:42.221312 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bde080b-3820-463f-a27d-9fb9a7843d5d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:42.240944 master-0 kubenswrapper[6932]: I0319 11:53:42.234838 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"] Mar 19 11:53:42.240944 master-0 kubenswrapper[6932]: E0319 11:53:42.235374 
6932 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2" podUID="abc21a83-e7d5-406f-a2b9-be189b0ef9a5" Mar 19 11:53:42.240944 master-0 kubenswrapper[6932]: I0319 11:53:42.235688 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-9ltgx"] Mar 19 11:53:42.254757 master-0 kubenswrapper[6932]: W0319 11:53:42.251286 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod376b18a9_5f33_44fd_a37b_20ab02c5e65d.slice/crio-8e1fd7c8f094ce1e4302e058e811af6aae2e3addf7bd81aa94568f27af29f0c9 WatchSource:0}: Error finding container 8e1fd7c8f094ce1e4302e058e811af6aae2e3addf7bd81aa94568f27af29f0c9: Status 404 returned error can't find the container with id 8e1fd7c8f094ce1e4302e058e811af6aae2e3addf7bd81aa94568f27af29f0c9 Mar 19 11:53:42.431101 master-0 kubenswrapper[6932]: I0319 11:53:42.430780 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 11:53:42.539504 master-0 kubenswrapper[6932]: I0319 11:53:42.528104 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-x6mmm"] Mar 19 11:53:42.539504 master-0 kubenswrapper[6932]: I0319 11:53:42.528680 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.582294 master-0 kubenswrapper[6932]: I0319 11:53:42.579518 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" event={"ID":"163d6a3d-0080-4122-bb7a-17f6e63f00f0","Type":"ContainerStarted","Data":"f9468c3e1600a144d91d27a137ca22e8dab8af4aed5534ad99b6957d73bac349"} Mar 19 11:53:42.582294 master-0 kubenswrapper[6932]: I0319 11:53:42.579916 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" event={"ID":"163d6a3d-0080-4122-bb7a-17f6e63f00f0","Type":"ContainerStarted","Data":"a5a674d7299c49bd88f1c56fca174966ef4c28920edc64023b6ce41812e041c8"} Mar 19 11:53:42.601470 master-0 kubenswrapper[6932]: I0319 11:53:42.601368 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" event={"ID":"333047c4-aeca-410e-9393-ca4e74366921","Type":"ContainerStarted","Data":"785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c"} Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.604463 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" event={"ID":"69a2593c-e0f5-4e0b-9406-a96a3802c7cb","Type":"ContainerStarted","Data":"0aa98b9120c0e97528b4d8961a0319c140af8ee3a5132f8e3a549ab86c741b48"} Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.607747 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-var-lib-kubelet\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.607788 6932 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-systemd\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.607817 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-run\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608053 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-lib-modules\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608170 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-conf\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608257 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-modprobe-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 
11:53:42.608291 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608359 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysconfig\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608414 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-sys\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608600 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-kubernetes\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608657 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-host\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 
kubenswrapper[6932]: I0319 11:53:42.608714 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-tmp\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608752 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-tuned\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.608866 master-0 kubenswrapper[6932]: I0319 11:53:42.608802 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7784\" (UniqueName: \"kubernetes.io/projected/8376e1f9-ab05-42d4-aa66-284a167a9bfc-kube-api-access-n7784\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.609517 master-0 kubenswrapper[6932]: I0319 11:53:42.609068 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" event={"ID":"1b94d1eb-1b80-4a14-b1c0-d9e192231352","Type":"ContainerStarted","Data":"8fcb298ecd66e79f2851c7b4502a7734938f56462fb5de52ed324ec2a3679f14"} Mar 19 11:53:42.618887 master-0 kubenswrapper[6932]: I0319 11:53:42.618742 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" event={"ID":"22e10648-af7c-409e-b947-570e7d807e05","Type":"ContainerStarted","Data":"b7962275b6bf457cce0b695a6d26259112f637c1039846c248e1f5dd85199b18"} Mar 19 11:53:42.621633 master-0 kubenswrapper[6932]: I0319 11:53:42.620503 6932 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"96498b3d-c93f-4b42-a0aa-2afec3450b1d","Type":"ContainerStarted","Data":"b0eeab0f4b63d0b832bcb033f60d90bd7a9ab1aefa13cc2a83e1411234017f43"} Mar 19 11:53:42.634760 master-0 kubenswrapper[6932]: I0319 11:53:42.630109 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" event={"ID":"e48b5aa9-293e-4222-91ff-7640addeca4c","Type":"ContainerStarted","Data":"caa4e9bd96e874f51a79da89bbb64da72933b4ef3464772d351cf399d375866a"} Mar 19 11:53:42.642052 master-0 kubenswrapper[6932]: I0319 11:53:42.637150 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" event={"ID":"aaaaf539-bf61-44d7-8d47-97535b7aa1ba","Type":"ContainerStarted","Data":"055df0b3895c94327b9f571e49f68fdf7023c19cdd8211d0f76234eb07218a32"} Mar 19 11:53:42.671894 master-0 kubenswrapper[6932]: I0319 11:53:42.671071 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr" Mar 19 11:53:42.671894 master-0 kubenswrapper[6932]: I0319 11:53:42.671545 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" event={"ID":"376b18a9-5f33-44fd-a37b-20ab02c5e65d","Type":"ContainerStarted","Data":"8e1fd7c8f094ce1e4302e058e811af6aae2e3addf7bd81aa94568f27af29f0c9"} Mar 19 11:53:42.671894 master-0 kubenswrapper[6932]: I0319 11:53:42.671630 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2" Mar 19 11:53:42.710895 master-0 kubenswrapper[6932]: I0319 11:53:42.710581 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-var-lib-kubelet\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.710895 master-0 kubenswrapper[6932]: I0319 11:53:42.710634 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-systemd\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.710895 master-0 kubenswrapper[6932]: I0319 11:53:42.710688 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-run\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.710895 master-0 kubenswrapper[6932]: I0319 11:53:42.710778 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-lib-modules\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.710895 master-0 kubenswrapper[6932]: I0319 11:53:42.710805 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-conf\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.710895 master-0 kubenswrapper[6932]: I0319 11:53:42.710867 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-modprobe-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.710895 master-0 kubenswrapper[6932]: I0319 11:53:42.710902 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.712063 master-0 kubenswrapper[6932]: I0319 11:53:42.710975 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysconfig\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.712063 master-0 kubenswrapper[6932]: I0319 11:53:42.710998 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-sys\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.712063 master-0 kubenswrapper[6932]: I0319 11:53:42.711036 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-kubernetes\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.712063 master-0 kubenswrapper[6932]: I0319 11:53:42.711060 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-host\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.712063 master-0 kubenswrapper[6932]: I0319 11:53:42.711081 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-tmp\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.712063 master-0 kubenswrapper[6932]: I0319 11:53:42.711107 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-tuned\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.726385 master-0 kubenswrapper[6932]: I0319 11:53:42.725836 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7784\" (UniqueName: \"kubernetes.io/projected/8376e1f9-ab05-42d4-aa66-284a167a9bfc-kube-api-access-n7784\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733034 master-0 kubenswrapper[6932]: I0319 11:53:42.731356 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-kubernetes\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 
11:53:42.733034 master-0 kubenswrapper[6932]: I0319 11:53:42.731542 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysconfig\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733034 master-0 kubenswrapper[6932]: I0319 11:53:42.731657 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-sys\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733034 master-0 kubenswrapper[6932]: I0319 11:53:42.731801 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-var-lib-kubelet\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733034 master-0 kubenswrapper[6932]: I0319 11:53:42.732148 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-systemd\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733034 master-0 kubenswrapper[6932]: I0319 11:53:42.732167 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-host\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733034 master-0 kubenswrapper[6932]: I0319 11:53:42.732390 6932 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-run\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733410 master-0 kubenswrapper[6932]: I0319 11:53:42.733125 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-modprobe-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733410 master-0 kubenswrapper[6932]: I0319 11:53:42.733171 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-conf\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733410 master-0 kubenswrapper[6932]: I0319 11:53:42.733280 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.733410 master-0 kubenswrapper[6932]: I0319 11:53:42.733333 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-lib-modules\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.752887 master-0 kubenswrapper[6932]: I0319 11:53:42.751324 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-tuned\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.752887 master-0 kubenswrapper[6932]: I0319 11:53:42.752014 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr" Mar 19 11:53:42.757439 master-0 kubenswrapper[6932]: I0319 11:53:42.757030 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-tmp\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.782801 master-0 kubenswrapper[6932]: I0319 11:53:42.760247 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7784\" (UniqueName: \"kubernetes.io/projected/8376e1f9-ab05-42d4-aa66-284a167a9bfc-kube-api-access-n7784\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.782801 master-0 kubenswrapper[6932]: I0319 11:53:42.764213 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2" Mar 19 11:53:42.814077 master-0 kubenswrapper[6932]: I0319 11:53:42.812674 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 11:53:42.830850 master-0 kubenswrapper[6932]: I0319 11:53:42.830681 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-client-ca\") pod \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " Mar 19 11:53:42.831448 master-0 kubenswrapper[6932]: I0319 11:53:42.831408 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2bq5\" (UniqueName: \"kubernetes.io/projected/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-kube-api-access-z2bq5\") pod \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " Mar 19 11:53:42.831850 master-0 kubenswrapper[6932]: I0319 11:53:42.831717 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-config\") pod \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " Mar 19 11:53:42.832169 master-0 kubenswrapper[6932]: I0319 11:53:42.832141 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-client-ca\") pod \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " Mar 19 11:53:42.832214 master-0 kubenswrapper[6932]: I0319 11:53:42.832180 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-proxy-ca-bundles\") 
pod \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " Mar 19 11:53:42.832355 master-0 kubenswrapper[6932]: I0319 11:53:42.832330 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-config\") pod \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " Mar 19 11:53:42.832524 master-0 kubenswrapper[6932]: I0319 11:53:42.832499 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j7ll\" (UniqueName: \"kubernetes.io/projected/12ec81c5-bbfd-414b-8b1f-c814fcda5791-kube-api-access-6j7ll\") pod \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " Mar 19 11:53:42.833222 master-0 kubenswrapper[6932]: I0319 11:53:42.833153 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "12ec81c5-bbfd-414b-8b1f-c814fcda5791" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:42.833222 master-0 kubenswrapper[6932]: I0319 11:53:42.833185 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-client-ca" (OuterVolumeSpecName: "client-ca") pod "abc21a83-e7d5-406f-a2b9-be189b0ef9a5" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:42.833467 master-0 kubenswrapper[6932]: I0319 11:53:42.833399 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-config" (OuterVolumeSpecName: "config") pod "12ec81c5-bbfd-414b-8b1f-c814fcda5791" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:42.834519 master-0 kubenswrapper[6932]: I0319 11:53:42.834480 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-config" (OuterVolumeSpecName: "config") pod "abc21a83-e7d5-406f-a2b9-be189b0ef9a5" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:42.834577 master-0 kubenswrapper[6932]: I0319 11:53:42.834486 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-client-ca" (OuterVolumeSpecName: "client-ca") pod "12ec81c5-bbfd-414b-8b1f-c814fcda5791" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:42.836647 master-0 kubenswrapper[6932]: I0319 11:53:42.836599 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ec81c5-bbfd-414b-8b1f-c814fcda5791-kube-api-access-6j7ll" (OuterVolumeSpecName: "kube-api-access-6j7ll") pod "12ec81c5-bbfd-414b-8b1f-c814fcda5791" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791"). InnerVolumeSpecName "kube-api-access-6j7ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:53:42.838991 master-0 kubenswrapper[6932]: I0319 11:53:42.838954 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:42.838991 master-0 kubenswrapper[6932]: I0319 11:53:42.838984 6932 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:42.839096 master-0 kubenswrapper[6932]: I0319 11:53:42.838998 6932 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:42.839096 master-0 kubenswrapper[6932]: I0319 11:53:42.839009 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ec81c5-bbfd-414b-8b1f-c814fcda5791-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:42.839096 master-0 kubenswrapper[6932]: I0319 11:53:42.839018 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j7ll\" (UniqueName: \"kubernetes.io/projected/12ec81c5-bbfd-414b-8b1f-c814fcda5791-kube-api-access-6j7ll\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:42.839096 master-0 kubenswrapper[6932]: I0319 11:53:42.839029 6932 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:42.839532 master-0 kubenswrapper[6932]: I0319 11:53:42.839495 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-kube-api-access-z2bq5" (OuterVolumeSpecName: 
"kube-api-access-z2bq5") pod "abc21a83-e7d5-406f-a2b9-be189b0ef9a5" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5"). InnerVolumeSpecName "kube-api-access-z2bq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:53:42.898708 master-0 kubenswrapper[6932]: I0319 11:53:42.898641 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:53:42.940880 master-0 kubenswrapper[6932]: I0319 11:53:42.940443 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2bq5\" (UniqueName: \"kubernetes.io/projected/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-kube-api-access-z2bq5\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:43.155750 master-0 kubenswrapper[6932]: I0319 11:53:43.154403 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ztgjs"] Mar 19 11:53:43.174750 master-0 kubenswrapper[6932]: I0319 11:53:43.163341 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.174750 master-0 kubenswrapper[6932]: I0319 11:53:43.167795 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 11:53:43.174750 master-0 kubenswrapper[6932]: I0319 11:53:43.168047 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 11:53:43.174750 master-0 kubenswrapper[6932]: I0319 11:53:43.169398 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ztgjs"] Mar 19 11:53:43.174750 master-0 kubenswrapper[6932]: I0319 11:53:43.169516 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 11:53:43.174750 master-0 kubenswrapper[6932]: I0319 11:53:43.169933 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 11:53:43.250815 master-0 kubenswrapper[6932]: I0319 11:53:43.250483 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:53:43.272847 master-0 kubenswrapper[6932]: I0319 11:53:43.269938 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-metrics-tls\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.272847 master-0 kubenswrapper[6932]: I0319 11:53:43.270033 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-config-volume\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.272847 master-0 
kubenswrapper[6932]: I0319 11:53:43.270103 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxjl\" (UniqueName: \"kubernetes.io/projected/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-kube-api-access-2mxjl\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.380574 master-0 kubenswrapper[6932]: I0319 11:53:43.380400 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-metrics-tls\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.380574 master-0 kubenswrapper[6932]: I0319 11:53:43.380460 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-config-volume\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.380574 master-0 kubenswrapper[6932]: I0319 11:53:43.380507 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxjl\" (UniqueName: \"kubernetes.io/projected/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-kube-api-access-2mxjl\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.382741 master-0 kubenswrapper[6932]: I0319 11:53:43.382597 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-config-volume\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.392123 master-0 kubenswrapper[6932]: I0319 11:53:43.392059 6932 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-metrics-tls\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.402328 master-0 kubenswrapper[6932]: I0319 11:53:43.402272 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxjl\" (UniqueName: \"kubernetes.io/projected/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-kube-api-access-2mxjl\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.997946 master-0 kubenswrapper[6932]: I0319 11:53:43.997685 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:43.999283 master-0 kubenswrapper[6932]: I0319 11:53:43.998488 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pm77f"] Mar 19 11:53:43.999494 master-0 kubenswrapper[6932]: I0319 11:53:43.999401 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pm77f" Mar 19 11:53:44.001637 master-0 kubenswrapper[6932]: I0319 11:53:44.000566 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr" Mar 19 11:53:44.001637 master-0 kubenswrapper[6932]: I0319 11:53:44.000667 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2" Mar 19 11:53:44.006198 master-0 kubenswrapper[6932]: I0319 11:53:44.006151 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"route-controller-manager-9f85cf6f7-6kjz2\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2" Mar 19 11:53:44.009152 master-0 kubenswrapper[6932]: I0319 11:53:44.009089 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod \"controller-manager-6d4875b77f-58xfr\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr" Mar 19 11:53:44.069845 master-0 kubenswrapper[6932]: I0319 11:53:44.068952 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" 
event={"ID":"6bde080b-3820-463f-a27d-9fb9a7843d5d","Type":"ContainerStarted","Data":"bdd2ba95a96b40f792db569b1a38d500c6161c9b6b35b6b22d8099e9a3a35339"} Mar 19 11:53:44.069845 master-0 kubenswrapper[6932]: I0319 11:53:44.069493 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"6bde080b-3820-463f-a27d-9fb9a7843d5d","Type":"ContainerStarted","Data":"89d6b9652bfd68fb0b68a832373fa141222adae111524f0fd223064e1824cd6a"} Mar 19 11:53:44.069845 master-0 kubenswrapper[6932]: I0319 11:53:44.069526 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:44.069845 master-0 kubenswrapper[6932]: I0319 11:53:44.069544 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" event={"ID":"1b94d1eb-1b80-4a14-b1c0-d9e192231352","Type":"ContainerStarted","Data":"b1e2d143823123ceb6bd23805a885b259bd2c9edae2f5dbbc3924a67acd5c3c1"} Mar 19 11:53:44.069845 master-0 kubenswrapper[6932]: I0319 11:53:44.069562 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:44.069845 master-0 kubenswrapper[6932]: I0319 11:53:44.069579 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" event={"ID":"376b18a9-5f33-44fd-a37b-20ab02c5e65d","Type":"ContainerStarted","Data":"4c09f5575088b49e0ef7e52a5eb347dfd8470474e6a6ff5bf019908a8d6b87bc"} Mar 19 11:53:44.069845 master-0 kubenswrapper[6932]: I0319 11:53:44.069592 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" 
event={"ID":"376b18a9-5f33-44fd-a37b-20ab02c5e65d","Type":"ContainerStarted","Data":"bebe23916813176b01b8f4cc2d1e2ed82e08f9e67464ee08d1a4acd826ab717f"} Mar 19 11:53:44.074255 master-0 kubenswrapper[6932]: I0319 11:53:44.074205 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" event={"ID":"22e10648-af7c-409e-b947-570e7d807e05","Type":"ContainerStarted","Data":"c79ab056f71ba031e71ae670068865583b292f1a5476eb2423d416fe7a2b14e8"} Mar 19 11:53:44.082093 master-0 kubenswrapper[6932]: I0319 11:53:44.078102 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" podStartSLOduration=12.078072234 podStartE2EDuration="12.078072234s" podCreationTimestamp="2026-03-19 11:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:44.075444476 +0000 UTC m=+48.434504698" watchObservedRunningTime="2026-03-19 11:53:44.078072234 +0000 UTC m=+48.437132456" Mar 19 11:53:44.082093 master-0 kubenswrapper[6932]: I0319 11:53:44.079902 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"96498b3d-c93f-4b42-a0aa-2afec3450b1d","Type":"ContainerStarted","Data":"6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc"} Mar 19 11:53:44.082093 master-0 kubenswrapper[6932]: I0319 11:53:44.080063 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="96498b3d-c93f-4b42-a0aa-2afec3450b1d" containerName="installer" containerID="cri-o://6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc" gracePeriod=30 Mar 19 11:53:44.085653 master-0 kubenswrapper[6932]: I0319 11:53:44.085584 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" event={"ID":"8376e1f9-ab05-42d4-aa66-284a167a9bfc","Type":"ContainerStarted","Data":"36e68c79afb391ff1a81c97bb40fb6639b3122523f23b3991ebc27a18bff6513"} Mar 19 11:53:44.085653 master-0 kubenswrapper[6932]: I0319 11:53:44.085649 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" event={"ID":"8376e1f9-ab05-42d4-aa66-284a167a9bfc","Type":"ContainerStarted","Data":"7425f7c738b48f360d2c2c8e1f1acdfb59f2e4a70fd29f7b8bbb3e5fc2d8360a"} Mar 19 11:53:44.086190 master-0 kubenswrapper[6932]: I0319 11:53:44.086028 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2" Mar 19 11:53:44.086528 master-0 kubenswrapper[6932]: I0319 11:53:44.086499 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4875b77f-58xfr" Mar 19 11:53:44.093983 master-0 kubenswrapper[6932]: I0319 11:53:44.093703 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=3.093682206 podStartE2EDuration="3.093682206s" podCreationTimestamp="2026-03-19 11:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:44.092355126 +0000 UTC m=+48.451415368" watchObservedRunningTime="2026-03-19 11:53:44.093682206 +0000 UTC m=+48.452742428" Mar 19 11:53:44.102895 master-0 kubenswrapper[6932]: I0319 11:53:44.102790 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6bf\" (UniqueName: \"kubernetes.io/projected/1c898657-f06b-44ab-95ff-53a324759ba1-kube-api-access-mt6bf\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" 
Mar 19 11:53:44.103041 master-0 kubenswrapper[6932]: I0319 11:53:44.102958 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c898657-f06b-44ab-95ff-53a324759ba1-hosts-file\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:53:44.128273 master-0 kubenswrapper[6932]: I0319 11:53:44.128156 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=10.12812319 podStartE2EDuration="10.12812319s" podCreationTimestamp="2026-03-19 11:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:44.126386251 +0000 UTC m=+48.485446503" watchObservedRunningTime="2026-03-19 11:53:44.12812319 +0000 UTC m=+48.487183412" Mar 19 11:53:44.170581 master-0 kubenswrapper[6932]: I0319 11:53:44.168981 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" podStartSLOduration=12.168963708 podStartE2EDuration="12.168963708s" podCreationTimestamp="2026-03-19 11:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:44.167017334 +0000 UTC m=+48.526077556" watchObservedRunningTime="2026-03-19 11:53:44.168963708 +0000 UTC m=+48.528023970" Mar 19 11:53:44.218367 master-0 kubenswrapper[6932]: I0319 11:53:44.215366 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" podStartSLOduration=2.21534599 podStartE2EDuration="2.21534599s" podCreationTimestamp="2026-03-19 11:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:44.201226283 +0000 UTC m=+48.560286515" watchObservedRunningTime="2026-03-19 11:53:44.21534599 +0000 UTC m=+48.574406212" Mar 19 11:53:44.231360 master-0 kubenswrapper[6932]: I0319 11:53:44.228421 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") pod \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\" (UID: \"abc21a83-e7d5-406f-a2b9-be189b0ef9a5\") " Mar 19 11:53:44.231360 master-0 kubenswrapper[6932]: I0319 11:53:44.228574 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") pod \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\" (UID: \"12ec81c5-bbfd-414b-8b1f-c814fcda5791\") " Mar 19 11:53:44.231360 master-0 kubenswrapper[6932]: I0319 11:53:44.228802 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6bf\" (UniqueName: \"kubernetes.io/projected/1c898657-f06b-44ab-95ff-53a324759ba1-kube-api-access-mt6bf\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:53:44.231360 master-0 kubenswrapper[6932]: I0319 11:53:44.228852 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c898657-f06b-44ab-95ff-53a324759ba1-hosts-file\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:53:44.233691 master-0 kubenswrapper[6932]: I0319 11:53:44.233632 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c898657-f06b-44ab-95ff-53a324759ba1-hosts-file\") pod \"node-resolver-pm77f\" (UID: 
\"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:53:44.243671 master-0 kubenswrapper[6932]: I0319 11:53:44.242796 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "abc21a83-e7d5-406f-a2b9-be189b0ef9a5" (UID: "abc21a83-e7d5-406f-a2b9-be189b0ef9a5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:44.243671 master-0 kubenswrapper[6932]: I0319 11:53:44.242918 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12ec81c5-bbfd-414b-8b1f-c814fcda5791" (UID: "12ec81c5-bbfd-414b-8b1f-c814fcda5791"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:44.267256 master-0 kubenswrapper[6932]: I0319 11:53:44.267116 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6bf\" (UniqueName: \"kubernetes.io/projected/1c898657-f06b-44ab-95ff-53a324759ba1-kube-api-access-mt6bf\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:53:44.304498 master-0 kubenswrapper[6932]: I0319 11:53:44.304047 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:53:44.304902 master-0 kubenswrapper[6932]: I0319 11:53:44.304868 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.317894 master-0 kubenswrapper[6932]: I0319 11:53:44.315818 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ztgjs"] Mar 19 11:53:44.317894 master-0 kubenswrapper[6932]: I0319 11:53:44.315910 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:53:44.329915 master-0 kubenswrapper[6932]: I0319 11:53:44.329878 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ec81c5-bbfd-414b-8b1f-c814fcda5791-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:44.329915 master-0 kubenswrapper[6932]: I0319 11:53:44.329913 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc21a83-e7d5-406f-a2b9-be189b0ef9a5-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:44.383087 master-0 kubenswrapper[6932]: I0319 11:53:44.382998 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pm77f" Mar 19 11:53:44.421596 master-0 kubenswrapper[6932]: W0319 11:53:44.416373 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c898657_f06b_44ab_95ff_53a324759ba1.slice/crio-f7a356015607c77d353df6671f85d12adf9e42d7853bd37134503d15b666f482 WatchSource:0}: Error finding container f7a356015607c77d353df6671f85d12adf9e42d7853bd37134503d15b666f482: Status 404 returned error can't find the container with id f7a356015607c77d353df6671f85d12adf9e42d7853bd37134503d15b666f482 Mar 19 11:53:44.431774 master-0 kubenswrapper[6932]: I0319 11:53:44.430971 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.431774 master-0 kubenswrapper[6932]: I0319 11:53:44.431020 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.431774 master-0 kubenswrapper[6932]: I0319 11:53:44.431063 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-var-lock\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.497788 master-0 kubenswrapper[6932]: I0319 11:53:44.497233 6932 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-654cf7f8-7lm6v"] Mar 19 11:53:44.498918 master-0 kubenswrapper[6932]: I0319 11:53:44.498898 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.505248 master-0 kubenswrapper[6932]: I0319 11:53:44.505180 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 11:53:44.505902 master-0 kubenswrapper[6932]: I0319 11:53:44.505877 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 11:53:44.506164 master-0 kubenswrapper[6932]: I0319 11:53:44.505877 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 11:53:44.506232 master-0 kubenswrapper[6932]: I0319 11:53:44.506101 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 11:53:44.507219 master-0 kubenswrapper[6932]: I0319 11:53:44.506789 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 11:53:44.507867 master-0 kubenswrapper[6932]: I0319 11:53:44.507719 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d4875b77f-58xfr"] Mar 19 11:53:44.512773 master-0 kubenswrapper[6932]: I0319 11:53:44.512712 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 11:53:44.516526 master-0 kubenswrapper[6932]: I0319 11:53:44.516490 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-654cf7f8-7lm6v"] Mar 19 11:53:44.525352 master-0 kubenswrapper[6932]: I0319 11:53:44.525161 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-6d4875b77f-58xfr"] Mar 19 11:53:44.532676 master-0 kubenswrapper[6932]: I0319 11:53:44.532625 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.533059 master-0 kubenswrapper[6932]: I0319 11:53:44.532812 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.533059 master-0 kubenswrapper[6932]: I0319 11:53:44.532864 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-var-lock\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.533059 master-0 kubenswrapper[6932]: I0319 11:53:44.532958 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.533212 master-0 kubenswrapper[6932]: I0319 11:53:44.533134 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-var-lock\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " 
pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.554302 master-0 kubenswrapper[6932]: I0319 11:53:44.554250 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.567833 master-0 kubenswrapper[6932]: I0319 11:53:44.567664 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"] Mar 19 11:53:44.575560 master-0 kubenswrapper[6932]: I0319 11:53:44.575492 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f85cf6f7-6kjz2"] Mar 19 11:53:44.626991 master-0 kubenswrapper[6932]: I0319 11:53:44.626877 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:44.634276 master-0 kubenswrapper[6932]: I0319 11:53:44.633832 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-client-ca\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.634276 master-0 kubenswrapper[6932]: I0319 11:53:44.633881 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7d33d5f-797c-4491-a1e3-1506452d2aff-serving-cert\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.634276 master-0 
kubenswrapper[6932]: I0319 11:53:44.633949 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-config\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.634276 master-0 kubenswrapper[6932]: I0319 11:53:44.633976 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-proxy-ca-bundles\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.634276 master-0 kubenswrapper[6932]: I0319 11:53:44.633995 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d9dz\" (UniqueName: \"kubernetes.io/projected/e7d33d5f-797c-4491-a1e3-1506452d2aff-kube-api-access-8d9dz\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.742873 master-0 kubenswrapper[6932]: I0319 11:53:44.742618 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-client-ca\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.742873 master-0 kubenswrapper[6932]: I0319 11:53:44.742707 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7d33d5f-797c-4491-a1e3-1506452d2aff-serving-cert\") 
pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.742873 master-0 kubenswrapper[6932]: I0319 11:53:44.742783 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-config\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.742873 master-0 kubenswrapper[6932]: I0319 11:53:44.742825 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-proxy-ca-bundles\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.742873 master-0 kubenswrapper[6932]: I0319 11:53:44.742851 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d9dz\" (UniqueName: \"kubernetes.io/projected/e7d33d5f-797c-4491-a1e3-1506452d2aff-kube-api-access-8d9dz\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.744265 master-0 kubenswrapper[6932]: I0319 11:53:44.744213 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-client-ca\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.748144 master-0 kubenswrapper[6932]: I0319 11:53:44.747995 6932 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-config\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.748658 master-0 kubenswrapper[6932]: I0319 11:53:44.748613 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-proxy-ca-bundles\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.751668 master-0 kubenswrapper[6932]: I0319 11:53:44.751622 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7d33d5f-797c-4491-a1e3-1506452d2aff-serving-cert\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.766285 master-0 kubenswrapper[6932]: I0319 11:53:44.766228 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d9dz\" (UniqueName: \"kubernetes.io/projected/e7d33d5f-797c-4491-a1e3-1506452d2aff-kube-api-access-8d9dz\") pod \"controller-manager-654cf7f8-7lm6v\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.846082 master-0 kubenswrapper[6932]: I0319 11:53:44.846006 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:44.893027 master-0 kubenswrapper[6932]: I0319 11:53:44.892211 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:53:45.102009 master-0 kubenswrapper[6932]: I0319 11:53:45.101558 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ztgjs" event={"ID":"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c","Type":"ContainerStarted","Data":"d6045bc934b39d2e74e105cd2ee97a2d4e1d69429d08a4cbb80aeb107f492bc3"} Mar 19 11:53:45.112440 master-0 kubenswrapper[6932]: I0319 11:53:45.107167 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-654cf7f8-7lm6v"] Mar 19 11:53:45.123365 master-0 kubenswrapper[6932]: I0319 11:53:45.123304 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pm77f" event={"ID":"1c898657-f06b-44ab-95ff-53a324759ba1","Type":"ContainerStarted","Data":"e9f78a043a5364dfd8bb8c4aecfe1691cc8f8a178d0aa9496a8c3a6303ff52d5"} Mar 19 11:53:45.123488 master-0 kubenswrapper[6932]: I0319 11:53:45.123374 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pm77f" event={"ID":"1c898657-f06b-44ab-95ff-53a324759ba1","Type":"ContainerStarted","Data":"f7a356015607c77d353df6671f85d12adf9e42d7853bd37134503d15b666f482"} Mar 19 11:53:45.133332 master-0 kubenswrapper[6932]: I0319 11:53:45.133094 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"fe31583b-cf5c-47f4-9cd3-bb4964baae6e","Type":"ContainerStarted","Data":"1315fe63b992ebfde48753dbbe66d869fa11b59020e96d5bfd968d9ece99cc92"} Mar 19 11:53:45.136224 master-0 kubenswrapper[6932]: W0319 11:53:45.136197 6932 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7d33d5f_797c_4491_a1e3_1506452d2aff.slice/crio-17ef7d3c5afdf38a9291cfc7bd1abddb4ccc85cfe41ec061ac20ffbde675f248 WatchSource:0}: Error finding container 17ef7d3c5afdf38a9291cfc7bd1abddb4ccc85cfe41ec061ac20ffbde675f248: Status 404 returned error can't find the container with id 17ef7d3c5afdf38a9291cfc7bd1abddb4ccc85cfe41ec061ac20ffbde675f248 Mar 19 11:53:45.144284 master-0 kubenswrapper[6932]: I0319 11:53:45.144214 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pm77f" podStartSLOduration=2.144189285 podStartE2EDuration="2.144189285s" podCreationTimestamp="2026-03-19 11:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:45.142526958 +0000 UTC m=+49.501587200" watchObservedRunningTime="2026-03-19 11:53:45.144189285 +0000 UTC m=+49.503249507" Mar 19 11:53:45.877063 master-0 kubenswrapper[6932]: I0319 11:53:45.877007 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ec81c5-bbfd-414b-8b1f-c814fcda5791" path="/var/lib/kubelet/pods/12ec81c5-bbfd-414b-8b1f-c814fcda5791/volumes" Mar 19 11:53:45.877479 master-0 kubenswrapper[6932]: I0319 11:53:45.877446 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc21a83-e7d5-406f-a2b9-be189b0ef9a5" path="/var/lib/kubelet/pods/abc21a83-e7d5-406f-a2b9-be189b0ef9a5/volumes" Mar 19 11:53:46.141158 master-0 kubenswrapper[6932]: I0319 11:53:46.139847 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"fe31583b-cf5c-47f4-9cd3-bb4964baae6e","Type":"ContainerStarted","Data":"80609c53dc5ef9b4161df1ba3ad8361d008842c19903c035bffbf987b38faf4e"} Mar 19 11:53:46.146492 master-0 kubenswrapper[6932]: I0319 11:53:46.146423 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" event={"ID":"e7d33d5f-797c-4491-a1e3-1506452d2aff","Type":"ContainerStarted","Data":"17ef7d3c5afdf38a9291cfc7bd1abddb4ccc85cfe41ec061ac20ffbde675f248"} Mar 19 11:53:46.161159 master-0 kubenswrapper[6932]: I0319 11:53:46.161061 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=2.161038828 podStartE2EDuration="2.161038828s" podCreationTimestamp="2026-03-19 11:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:46.156551217 +0000 UTC m=+50.515611469" watchObservedRunningTime="2026-03-19 11:53:46.161038828 +0000 UTC m=+50.520099050" Mar 19 11:53:47.011099 master-0 kubenswrapper[6932]: I0319 11:53:47.011022 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv"] Mar 19 11:53:47.011681 master-0 kubenswrapper[6932]: I0319 11:53:47.011651 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.015982 master-0 kubenswrapper[6932]: I0319 11:53:47.015944 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 11:53:47.016054 master-0 kubenswrapper[6932]: I0319 11:53:47.016025 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 11:53:47.016104 master-0 kubenswrapper[6932]: I0319 11:53:47.016076 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 11:53:47.016196 master-0 kubenswrapper[6932]: I0319 11:53:47.016127 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 11:53:47.016243 master-0 kubenswrapper[6932]: I0319 11:53:47.016210 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 11:53:47.022116 master-0 kubenswrapper[6932]: I0319 11:53:47.022076 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv"] Mar 19 11:53:47.107294 master-0 kubenswrapper[6932]: I0319 11:53:47.107241 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57761fc9-b8d5-4dd5-8882-9f33ef79111a-serving-cert\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.107520 master-0 kubenswrapper[6932]: I0319 11:53:47.107325 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd5w8\" (UniqueName: 
\"kubernetes.io/projected/57761fc9-b8d5-4dd5-8882-9f33ef79111a-kube-api-access-vd5w8\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.107520 master-0 kubenswrapper[6932]: I0319 11:53:47.107376 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-client-ca\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.107520 master-0 kubenswrapper[6932]: I0319 11:53:47.107423 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-config\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.208389 master-0 kubenswrapper[6932]: I0319 11:53:47.208332 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-config\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.209036 master-0 kubenswrapper[6932]: I0319 11:53:47.208533 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57761fc9-b8d5-4dd5-8882-9f33ef79111a-serving-cert\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " 
pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.209036 master-0 kubenswrapper[6932]: I0319 11:53:47.208596 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd5w8\" (UniqueName: \"kubernetes.io/projected/57761fc9-b8d5-4dd5-8882-9f33ef79111a-kube-api-access-vd5w8\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.209036 master-0 kubenswrapper[6932]: I0319 11:53:47.208635 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-client-ca\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.209514 master-0 kubenswrapper[6932]: I0319 11:53:47.209491 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-client-ca\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.209919 master-0 kubenswrapper[6932]: I0319 11:53:47.209877 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-config\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.215400 master-0 kubenswrapper[6932]: I0319 11:53:47.215372 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/57761fc9-b8d5-4dd5-8882-9f33ef79111a-serving-cert\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.620881 master-0 kubenswrapper[6932]: I0319 11:53:47.620802 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd5w8\" (UniqueName: \"kubernetes.io/projected/57761fc9-b8d5-4dd5-8882-9f33ef79111a-kube-api-access-vd5w8\") pod \"route-controller-manager-f6c859bf8-g2lgv\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:47.691721 master-0 kubenswrapper[6932]: I0319 11:53:47.691645 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:50.147672 master-0 kubenswrapper[6932]: I0319 11:53:50.147594 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv"] Mar 19 11:53:50.159063 master-0 kubenswrapper[6932]: I0319 11:53:50.159003 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 11:53:50.160130 master-0 kubenswrapper[6932]: I0319 11:53:50.160057 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.162164 master-0 kubenswrapper[6932]: I0319 11:53:50.162127 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 11:53:50.181219 master-0 kubenswrapper[6932]: I0319 11:53:50.180968 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 11:53:50.185514 master-0 kubenswrapper[6932]: I0319 11:53:50.184281 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" event={"ID":"e7d33d5f-797c-4491-a1e3-1506452d2aff","Type":"ContainerStarted","Data":"a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74"} Mar 19 11:53:50.187580 master-0 kubenswrapper[6932]: I0319 11:53:50.187539 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:50.201317 master-0 kubenswrapper[6932]: I0319 11:53:50.201256 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:53:50.206902 master-0 kubenswrapper[6932]: I0319 11:53:50.206839 6932 generic.go:334] "Generic (PLEG): container finished" podID="e48b5aa9-293e-4222-91ff-7640addeca4c" containerID="0041aa33e170f47251865926ed112bdffedc66315fe41f5f63242817433881b1" exitCode=0 Mar 19 11:53:50.207020 master-0 kubenswrapper[6932]: I0319 11:53:50.206937 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" event={"ID":"e48b5aa9-293e-4222-91ff-7640addeca4c","Type":"ContainerDied","Data":"0041aa33e170f47251865926ed112bdffedc66315fe41f5f63242817433881b1"} Mar 19 11:53:50.211855 master-0 kubenswrapper[6932]: I0319 11:53:50.211792 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" podStartSLOduration=3.597141579 podStartE2EDuration="8.211769414s" podCreationTimestamp="2026-03-19 11:53:42 +0000 UTC" firstStartedPulling="2026-03-19 11:53:45.139807367 +0000 UTC m=+49.498867589" lastFinishedPulling="2026-03-19 11:53:49.754435202 +0000 UTC m=+54.113495424" observedRunningTime="2026-03-19 11:53:50.210876485 +0000 UTC m=+54.569936717" watchObservedRunningTime="2026-03-19 11:53:50.211769414 +0000 UTC m=+54.570829626" Mar 19 11:53:50.214642 master-0 kubenswrapper[6932]: I0319 11:53:50.214604 6932 generic.go:334] "Generic (PLEG): container finished" podID="69a2593c-e0f5-4e0b-9406-a96a3802c7cb" containerID="9778989e23f8e62b1c5af9186416eb8c7c3c13efba2cab28ce54c71fc9ada2c1" exitCode=0 Mar 19 11:53:50.214766 master-0 kubenswrapper[6932]: I0319 11:53:50.214673 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" event={"ID":"69a2593c-e0f5-4e0b-9406-a96a3802c7cb","Type":"ContainerDied","Data":"9778989e23f8e62b1c5af9186416eb8c7c3c13efba2cab28ce54c71fc9ada2c1"} Mar 19 11:53:50.218668 master-0 kubenswrapper[6932]: I0319 11:53:50.218614 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ztgjs" event={"ID":"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c","Type":"ContainerStarted","Data":"ffdc964a7955e48a0a3e7953d308aed48959c213a8f5342f17c50c088af2154c"} Mar 19 11:53:50.253608 master-0 kubenswrapper[6932]: I0319 11:53:50.253456 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-var-lock\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.253608 master-0 kubenswrapper[6932]: I0319 11:53:50.253530 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.255030 master-0 kubenswrapper[6932]: I0319 11:53:50.254975 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.357461 master-0 kubenswrapper[6932]: I0319 11:53:50.357065 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.357461 master-0 kubenswrapper[6932]: I0319 11:53:50.357131 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-var-lock\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.357461 master-0 kubenswrapper[6932]: I0319 11:53:50.357156 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.357461 master-0 kubenswrapper[6932]: I0319 11:53:50.357386 6932 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.357461 master-0 kubenswrapper[6932]: I0319 11:53:50.357464 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-var-lock\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.377484 master-0 kubenswrapper[6932]: I0319 11:53:50.377426 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.503696 master-0 kubenswrapper[6932]: I0319 11:53:50.503604 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:53:50.519845 master-0 kubenswrapper[6932]: I0319 11:53:50.518981 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:50.561263 master-0 kubenswrapper[6932]: I0319 11:53:50.560964 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-policies\") pod \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " Mar 19 11:53:50.561263 master-0 kubenswrapper[6932]: I0319 11:53:50.561105 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt4fx\" (UniqueName: \"kubernetes.io/projected/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-kube-api-access-dt4fx\") pod \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " Mar 19 11:53:50.561263 master-0 kubenswrapper[6932]: I0319 11:53:50.561162 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-serving-ca\") pod \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " Mar 19 11:53:50.562009 master-0 kubenswrapper[6932]: I0319 11:53:50.561835 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "69a2593c-e0f5-4e0b-9406-a96a3802c7cb" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:50.562080 master-0 kubenswrapper[6932]: I0319 11:53:50.562054 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client\") pod \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " Mar 19 11:53:50.562110 master-0 kubenswrapper[6932]: I0319 11:53:50.562099 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-encryption-config\") pod \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " Mar 19 11:53:50.562164 master-0 kubenswrapper[6932]: I0319 11:53:50.562148 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-dir\") pod \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " Mar 19 11:53:50.562197 master-0 kubenswrapper[6932]: I0319 11:53:50.562143 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "69a2593c-e0f5-4e0b-9406-a96a3802c7cb" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:50.562197 master-0 kubenswrapper[6932]: I0319 11:53:50.562189 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-trusted-ca-bundle\") pod \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " Mar 19 11:53:50.562255 master-0 kubenswrapper[6932]: I0319 11:53:50.562224 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert\") pod \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\" (UID: \"69a2593c-e0f5-4e0b-9406-a96a3802c7cb\") " Mar 19 11:53:50.563789 master-0 kubenswrapper[6932]: I0319 11:53:50.562407 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "69a2593c-e0f5-4e0b-9406-a96a3802c7cb" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:53:50.563789 master-0 kubenswrapper[6932]: I0319 11:53:50.562890 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "69a2593c-e0f5-4e0b-9406-a96a3802c7cb" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:50.563789 master-0 kubenswrapper[6932]: I0319 11:53:50.563061 6932 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:50.563789 master-0 kubenswrapper[6932]: I0319 11:53:50.563086 6932 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:50.563789 master-0 kubenswrapper[6932]: I0319 11:53:50.563100 6932 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:50.563789 master-0 kubenswrapper[6932]: I0319 11:53:50.563115 6932 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:50.565679 master-0 kubenswrapper[6932]: I0319 11:53:50.565642 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "69a2593c-e0f5-4e0b-9406-a96a3802c7cb" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:50.566259 master-0 kubenswrapper[6932]: I0319 11:53:50.566212 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-kube-api-access-dt4fx" (OuterVolumeSpecName: "kube-api-access-dt4fx") pod "69a2593c-e0f5-4e0b-9406-a96a3802c7cb" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb"). InnerVolumeSpecName "kube-api-access-dt4fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:53:50.566528 master-0 kubenswrapper[6932]: I0319 11:53:50.566490 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "69a2593c-e0f5-4e0b-9406-a96a3802c7cb" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:50.568086 master-0 kubenswrapper[6932]: I0319 11:53:50.567959 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "69a2593c-e0f5-4e0b-9406-a96a3802c7cb" (UID: "69a2593c-e0f5-4e0b-9406-a96a3802c7cb"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:50.672781 master-0 kubenswrapper[6932]: I0319 11:53:50.672716 6932 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:50.672781 master-0 kubenswrapper[6932]: I0319 11:53:50.672780 6932 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:50.672781 master-0 kubenswrapper[6932]: I0319 11:53:50.672791 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:50.673062 master-0 kubenswrapper[6932]: I0319 11:53:50.672801 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt4fx\" (UniqueName: \"kubernetes.io/projected/69a2593c-e0f5-4e0b-9406-a96a3802c7cb-kube-api-access-dt4fx\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:50.673062 master-0 kubenswrapper[6932]: I0319 11:53:50.672915 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:53:51.224852 master-0 kubenswrapper[6932]: I0319 11:53:51.224779 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" event={"ID":"69a2593c-e0f5-4e0b-9406-a96a3802c7cb","Type":"ContainerDied","Data":"0aa98b9120c0e97528b4d8961a0319c140af8ee3a5132f8e3a549ab86c741b48"} Mar 19 11:53:51.224852 master-0 kubenswrapper[6932]: I0319 11:53:51.224843 6932 scope.go:117] "RemoveContainer" containerID="9778989e23f8e62b1c5af9186416eb8c7c3c13efba2cab28ce54c71fc9ada2c1" Mar 19 11:53:51.225391 master-0 kubenswrapper[6932]: 
I0319 11:53:51.224958 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-9ltgx" Mar 19 11:53:51.230232 master-0 kubenswrapper[6932]: I0319 11:53:51.230177 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ztgjs" event={"ID":"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c","Type":"ContainerStarted","Data":"69030d289a6bd635d03d6c28333100bb577b4d8cdfced8357d61376b39bed54b"} Mar 19 11:53:51.230389 master-0 kubenswrapper[6932]: I0319 11:53:51.230347 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ztgjs" Mar 19 11:53:51.231636 master-0 kubenswrapper[6932]: I0319 11:53:51.231597 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" event={"ID":"57761fc9-b8d5-4dd5-8882-9f33ef79111a","Type":"ContainerStarted","Data":"71125a65a7cc09af08b5fe4545a16ebf2099cb6cf96c5de1eb791846f48b9224"} Mar 19 11:53:51.234587 master-0 kubenswrapper[6932]: I0319 11:53:51.233978 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" event={"ID":"e48b5aa9-293e-4222-91ff-7640addeca4c","Type":"ContainerStarted","Data":"85ffcd3f2b1b89b0cbed5461d0a6f23d76d1183f3b636a9ba10e8ef8e32f743d"} Mar 19 11:53:51.234587 master-0 kubenswrapper[6932]: I0319 11:53:51.234004 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" event={"ID":"e48b5aa9-293e-4222-91ff-7640addeca4c","Type":"ContainerStarted","Data":"62daf0f940e6b4d8435c2710c08ed980b8190c1907c369c73eca812d9aad7d45"} Mar 19 11:53:52.372771 master-0 kubenswrapper[6932]: I0319 11:53:52.371874 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 11:53:52.386162 master-0 kubenswrapper[6932]: W0319 11:53:52.386086 6932 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8e508a43_99db_49eb_bf4e_e3e6a0f49761.slice/crio-9f8ce6740e029884a042c5c80751b9fc216004e3a3d167dbdc2e5cb2a86f8183 WatchSource:0}: Error finding container 9f8ce6740e029884a042c5c80751b9fc216004e3a3d167dbdc2e5cb2a86f8183: Status 404 returned error can't find the container with id 9f8ce6740e029884a042c5c80751b9fc216004e3a3d167dbdc2e5cb2a86f8183 Mar 19 11:53:52.559053 master-0 kubenswrapper[6932]: I0319 11:53:52.558911 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ztgjs" podStartSLOduration=4.138150408 podStartE2EDuration="9.558889988s" podCreationTimestamp="2026-03-19 11:53:43 +0000 UTC" firstStartedPulling="2026-03-19 11:53:44.332511805 +0000 UTC m=+48.691572017" lastFinishedPulling="2026-03-19 11:53:49.753251375 +0000 UTC m=+54.112311597" observedRunningTime="2026-03-19 11:53:52.517331733 +0000 UTC m=+56.876391975" watchObservedRunningTime="2026-03-19 11:53:52.558889988 +0000 UTC m=+56.917950210" Mar 19 11:53:52.591437 master-0 kubenswrapper[6932]: I0319 11:53:52.591381 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r"] Mar 19 11:53:52.591997 master-0 kubenswrapper[6932]: E0319 11:53:52.591972 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a2593c-e0f5-4e0b-9406-a96a3802c7cb" containerName="fix-audit-permissions" Mar 19 11:53:52.591997 master-0 kubenswrapper[6932]: I0319 11:53:52.591998 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a2593c-e0f5-4e0b-9406-a96a3802c7cb" containerName="fix-audit-permissions" Mar 19 11:53:52.592237 master-0 kubenswrapper[6932]: I0319 11:53:52.592218 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a2593c-e0f5-4e0b-9406-a96a3802c7cb" containerName="fix-audit-permissions" Mar 19 11:53:52.594127 master-0 kubenswrapper[6932]: I0319 11:53:52.594103 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-5547669f67-9ltgx"] Mar 19 11:53:52.604020 master-0 kubenswrapper[6932]: I0319 11:53:52.594311 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.604020 master-0 kubenswrapper[6932]: I0319 11:53:52.602421 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 11:53:52.614547 master-0 kubenswrapper[6932]: I0319 11:53:52.614490 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 11:53:52.614789 master-0 kubenswrapper[6932]: I0319 11:53:52.614608 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 11:53:52.614789 master-0 kubenswrapper[6932]: I0319 11:53:52.612104 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-9ltgx"] Mar 19 11:53:52.630353 master-0 kubenswrapper[6932]: I0319 11:53:52.630228 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 11:53:52.630527 master-0 kubenswrapper[6932]: I0319 11:53:52.630486 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 11:53:52.630527 master-0 kubenswrapper[6932]: I0319 11:53:52.630490 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 11:53:52.630629 master-0 kubenswrapper[6932]: I0319 11:53:52.630604 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 11:53:52.630794 master-0 kubenswrapper[6932]: I0319 11:53:52.630757 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r"] Mar 19 
11:53:52.630899 master-0 kubenswrapper[6932]: I0319 11:53:52.630877 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 11:53:52.674224 master-0 kubenswrapper[6932]: I0319 11:53:52.674111 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" podStartSLOduration=9.16614637 podStartE2EDuration="16.674091208s" podCreationTimestamp="2026-03-19 11:53:36 +0000 UTC" firstStartedPulling="2026-03-19 11:53:42.245457421 +0000 UTC m=+46.604517643" lastFinishedPulling="2026-03-19 11:53:49.753402259 +0000 UTC m=+54.112462481" observedRunningTime="2026-03-19 11:53:52.672531613 +0000 UTC m=+57.031591845" watchObservedRunningTime="2026-03-19 11:53:52.674091208 +0000 UTC m=+57.033151430" Mar 19 11:53:52.705127 master-0 kubenswrapper[6932]: I0319 11:53:52.705066 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-serving-ca\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.705325 master-0 kubenswrapper[6932]: I0319 11:53:52.705167 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-serving-cert\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.705325 master-0 kubenswrapper[6932]: I0319 11:53:52.705219 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvkxx\" (UniqueName: \"kubernetes.io/projected/e45616db-f7dd-4a08-847f-abf2759d9fa4-kube-api-access-dvkxx\") pod 
\"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.705325 master-0 kubenswrapper[6932]: I0319 11:53:52.705251 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-policies\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.705325 master-0 kubenswrapper[6932]: I0319 11:53:52.705268 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-client\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.705325 master-0 kubenswrapper[6932]: I0319 11:53:52.705307 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-dir\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.705537 master-0 kubenswrapper[6932]: I0319 11:53:52.705390 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-encryption-config\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.705579 master-0 kubenswrapper[6932]: I0319 11:53:52.705522 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-trusted-ca-bundle\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.732914 master-0 kubenswrapper[6932]: I0319 11:53:52.732842 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:53:52.806820 master-0 kubenswrapper[6932]: I0319 11:53:52.806769 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-serving-cert\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807031 master-0 kubenswrapper[6932]: I0319 11:53:52.806832 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvkxx\" (UniqueName: \"kubernetes.io/projected/e45616db-f7dd-4a08-847f-abf2759d9fa4-kube-api-access-dvkxx\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807126 master-0 kubenswrapper[6932]: I0319 11:53:52.807032 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-policies\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807185 master-0 kubenswrapper[6932]: I0319 11:53:52.807151 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-client\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807588 master-0 kubenswrapper[6932]: I0319 11:53:52.807567 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-dir\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807711 master-0 kubenswrapper[6932]: I0319 11:53:52.807692 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-encryption-config\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807827 master-0 kubenswrapper[6932]: I0319 11:53:52.807712 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-policies\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807827 master-0 kubenswrapper[6932]: I0319 11:53:52.807779 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-trusted-ca-bundle\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807827 master-0 kubenswrapper[6932]: I0319 11:53:52.807688 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-dir\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.807929 master-0 kubenswrapper[6932]: I0319 11:53:52.807837 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-serving-ca\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.808239 master-0 kubenswrapper[6932]: I0319 11:53:52.808217 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-trusted-ca-bundle\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.808425 master-0 kubenswrapper[6932]: I0319 11:53:52.808403 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-serving-ca\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.810462 master-0 kubenswrapper[6932]: I0319 11:53:52.810431 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-serving-cert\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.823931 master-0 kubenswrapper[6932]: I0319 11:53:52.820352 6932 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-encryption-config\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.835464 master-0 kubenswrapper[6932]: I0319 11:53:52.835396 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-client\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.848636 master-0 kubenswrapper[6932]: I0319 11:53:52.848562 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvkxx\" (UniqueName: \"kubernetes.io/projected/e45616db-f7dd-4a08-847f-abf2759d9fa4-kube-api-access-dvkxx\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:52.918576 master-0 kubenswrapper[6932]: I0319 11:53:52.918314 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:52.918576 master-0 kubenswrapper[6932]: I0319 11:53:52.918418 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:52.963751 master-0 kubenswrapper[6932]: I0319 11:53:52.963653 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:53.398911 master-0 kubenswrapper[6932]: I0319 11:53:53.398841 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"8e508a43-99db-49eb-bf4e-e3e6a0f49761","Type":"ContainerStarted","Data":"300261e39c3fe1898b1aa4629252d5e05f336f7f74bdf1250eea81121a460d42"} Mar 19 11:53:53.398911 master-0 kubenswrapper[6932]: I0319 11:53:53.398895 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"8e508a43-99db-49eb-bf4e-e3e6a0f49761","Type":"ContainerStarted","Data":"9f8ce6740e029884a042c5c80751b9fc216004e3a3d167dbdc2e5cb2a86f8183"} Mar 19 11:53:54.172557 master-0 kubenswrapper[6932]: I0319 11:53:54.172495 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a2593c-e0f5-4e0b-9406-a96a3802c7cb" path="/var/lib/kubelet/pods/69a2593c-e0f5-4e0b-9406-a96a3802c7cb/volumes" Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: I0319 11:53:54.329821 6932 patch_prober.go:28] interesting pod/apiserver-f67f6868b-chx8j container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]log ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]etcd ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/max-in-flight-filter ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 19 11:53:54.330794 
master-0 kubenswrapper[6932]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/project.openshift.io-projectcache ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/openshift.io-startinformers ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: livez check failed Mar 19 11:53:54.330794 master-0 kubenswrapper[6932]: I0319 11:53:54.329991 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" podUID="e48b5aa9-293e-4222-91ff-7640addeca4c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:53:54.331854 master-0 kubenswrapper[6932]: I0319 11:53:54.330826 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=4.330801747 podStartE2EDuration="4.330801747s" podCreationTimestamp="2026-03-19 11:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:54.314359537 +0000 UTC m=+58.673419759" watchObservedRunningTime="2026-03-19 11:53:54.330801747 +0000 UTC m=+58.689861969" Mar 19 11:53:54.340690 master-0 kubenswrapper[6932]: I0319 11:53:54.339168 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r"] Mar 19 
11:53:55.418504 master-0 kubenswrapper[6932]: I0319 11:53:55.418444 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" event={"ID":"e45616db-f7dd-4a08-847f-abf2759d9fa4","Type":"ContainerStarted","Data":"832f700980eab592f836b89a6aebe98be99148aa95ac29165addb6fccc6389c3"} Mar 19 11:53:56.425882 master-0 kubenswrapper[6932]: I0319 11:53:56.425820 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" event={"ID":"57761fc9-b8d5-4dd5-8882-9f33ef79111a","Type":"ContainerStarted","Data":"aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f"} Mar 19 11:53:56.427127 master-0 kubenswrapper[6932]: I0319 11:53:56.427099 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:56.428929 master-0 kubenswrapper[6932]: I0319 11:53:56.427827 6932 generic.go:334] "Generic (PLEG): container finished" podID="e45616db-f7dd-4a08-847f-abf2759d9fa4" containerID="731a4a3fe4e18c18e36f2a5f5b232c45ff2da66d02993af6a4921783cc680289" exitCode=0 Mar 19 11:53:56.428929 master-0 kubenswrapper[6932]: I0319 11:53:56.427879 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" event={"ID":"e45616db-f7dd-4a08-847f-abf2759d9fa4","Type":"ContainerDied","Data":"731a4a3fe4e18c18e36f2a5f5b232c45ff2da66d02993af6a4921783cc680289"} Mar 19 11:53:56.438576 master-0 kubenswrapper[6932]: I0319 11:53:56.438214 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:53:56.497893 master-0 kubenswrapper[6932]: I0319 11:53:56.497799 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" 
podStartSLOduration=9.422318132000001 podStartE2EDuration="14.497715038s" podCreationTimestamp="2026-03-19 11:53:42 +0000 UTC" firstStartedPulling="2026-03-19 11:53:50.206077376 +0000 UTC m=+54.565137598" lastFinishedPulling="2026-03-19 11:53:55.281474282 +0000 UTC m=+59.640534504" observedRunningTime="2026-03-19 11:53:56.459683213 +0000 UTC m=+60.818743435" watchObservedRunningTime="2026-03-19 11:53:56.497715038 +0000 UTC m=+60.856775260" Mar 19 11:53:57.028850 master-0 kubenswrapper[6932]: I0319 11:53:57.028705 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 11:53:57.030215 master-0 kubenswrapper[6932]: I0319 11:53:57.030194 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.032278 master-0 kubenswrapper[6932]: I0319 11:53:57.031987 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 11:53:57.040307 master-0 kubenswrapper[6932]: I0319 11:53:57.040279 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 11:53:57.214627 master-0 kubenswrapper[6932]: I0319 11:53:57.214584 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-var-lock\") pod \"installer-1-master-0\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.214963 master-0 kubenswrapper[6932]: I0319 11:53:57.214947 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-kubelet-dir\") pod \"installer-1-master-0\" (UID: 
\"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.215092 master-0 kubenswrapper[6932]: I0319 11:53:57.215076 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c385dd73-4a25-4827-9c8f-d923afc782b7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.316899 master-0 kubenswrapper[6932]: I0319 11:53:57.316766 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.317189 master-0 kubenswrapper[6932]: I0319 11:53:57.317142 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c385dd73-4a25-4827-9c8f-d923afc782b7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.317367 master-0 kubenswrapper[6932]: I0319 11:53:57.316931 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.317539 master-0 kubenswrapper[6932]: I0319 11:53:57.317521 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-var-lock\") pod \"installer-1-master-0\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.317706 master-0 kubenswrapper[6932]: I0319 11:53:57.317571 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-var-lock\") pod \"installer-1-master-0\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.339289 master-0 kubenswrapper[6932]: I0319 11:53:57.339237 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c385dd73-4a25-4827-9c8f-d923afc782b7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.352457 master-0 kubenswrapper[6932]: I0319 11:53:57.352396 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:53:57.437764 master-0 kubenswrapper[6932]: I0319 11:53:57.437244 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" event={"ID":"e45616db-f7dd-4a08-847f-abf2759d9fa4","Type":"ContainerStarted","Data":"d3e3d057114af0c8c91a644fb741d99d0b64b8da8e104cdedc13a01897164ddd"} Mar 19 11:53:57.492077 master-0 kubenswrapper[6932]: I0319 11:53:57.491275 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" podStartSLOduration=20.491259966 podStartE2EDuration="20.491259966s" podCreationTimestamp="2026-03-19 11:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:57.489290062 +0000 UTC m=+61.848350284" watchObservedRunningTime="2026-03-19 11:53:57.491259966 +0000 UTC m=+61.850320188" Mar 19 11:53:57.810485 master-0 kubenswrapper[6932]: I0319 11:53:57.810436 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 11:53:57.823923 master-0 kubenswrapper[6932]: W0319 11:53:57.823892 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc385dd73_4a25_4827_9c8f_d923afc782b7.slice/crio-3cb65febed88a73558fad96ec47a3c975afac21bc24c0f69a6eaebb5e72b8a31 WatchSource:0}: Error finding container 3cb65febed88a73558fad96ec47a3c975afac21bc24c0f69a6eaebb5e72b8a31: Status 404 returned error can't find the container with id 3cb65febed88a73558fad96ec47a3c975afac21bc24c0f69a6eaebb5e72b8a31 Mar 19 11:53:57.923174 master-0 kubenswrapper[6932]: I0319 11:53:57.915587 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:57.935699 master-0 kubenswrapper[6932]: I0319 
11:53:57.935658 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:53:57.964484 master-0 kubenswrapper[6932]: I0319 11:53:57.964445 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:57.964771 master-0 kubenswrapper[6932]: I0319 11:53:57.964755 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:57.977038 master-0 kubenswrapper[6932]: I0319 11:53:57.976367 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:58.420531 master-0 kubenswrapper[6932]: I0319 11:53:58.420391 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-pk574"] Mar 19 11:53:58.420774 master-0 kubenswrapper[6932]: I0319 11:53:58.420609 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" podUID="333047c4-aeca-410e-9393-ca4e74366921" containerName="cluster-version-operator" containerID="cri-o://785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c" gracePeriod=130 Mar 19 11:53:58.450763 master-0 kubenswrapper[6932]: I0319 11:53:58.450306 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"c385dd73-4a25-4827-9c8f-d923afc782b7","Type":"ContainerStarted","Data":"168c2214cdbfaaf9c363e282042065f817a6654cada35b4a09dd8914621dd3a3"} Mar 19 11:53:58.450763 master-0 kubenswrapper[6932]: I0319 11:53:58.450368 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" 
event={"ID":"c385dd73-4a25-4827-9c8f-d923afc782b7","Type":"ContainerStarted","Data":"3cb65febed88a73558fad96ec47a3c975afac21bc24c0f69a6eaebb5e72b8a31"} Mar 19 11:53:58.463778 master-0 kubenswrapper[6932]: I0319 11:53:58.460963 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:53:58.479759 master-0 kubenswrapper[6932]: I0319 11:53:58.479267 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=1.479250091 podStartE2EDuration="1.479250091s" podCreationTimestamp="2026-03-19 11:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:58.479087187 +0000 UTC m=+62.838147419" watchObservedRunningTime="2026-03-19 11:53:58.479250091 +0000 UTC m=+62.838310303" Mar 19 11:53:59.197198 master-0 kubenswrapper[6932]: I0319 11:53:59.197162 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:53:59.301476 master-0 kubenswrapper[6932]: I0319 11:53:59.301403 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:53:59.301799 master-0 kubenswrapper[6932]: I0319 11:53:59.301689 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="fe31583b-cf5c-47f4-9cd3-bb4964baae6e" containerName="installer" containerID="cri-o://80609c53dc5ef9b4161df1ba3ad8361d008842c19903c035bffbf987b38faf4e" gracePeriod=30 Mar 19 11:53:59.350827 master-0 kubenswrapper[6932]: I0319 11:53:59.350441 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads\") pod \"333047c4-aeca-410e-9393-ca4e74366921\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " Mar 19 11:53:59.350827 master-0 kubenswrapper[6932]: I0319 11:53:59.350504 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca\") pod \"333047c4-aeca-410e-9393-ca4e74366921\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " Mar 19 11:53:59.350827 master-0 kubenswrapper[6932]: I0319 11:53:59.350549 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") pod \"333047c4-aeca-410e-9393-ca4e74366921\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " Mar 19 11:53:59.350827 master-0 kubenswrapper[6932]: I0319 11:53:59.350598 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access\") pod \"333047c4-aeca-410e-9393-ca4e74366921\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " Mar 19 11:53:59.350827 master-0 kubenswrapper[6932]: I0319 11:53:59.350653 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs\") pod \"333047c4-aeca-410e-9393-ca4e74366921\" (UID: \"333047c4-aeca-410e-9393-ca4e74366921\") " Mar 19 11:53:59.351205 master-0 kubenswrapper[6932]: I0319 11:53:59.350894 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "333047c4-aeca-410e-9393-ca4e74366921" (UID: "333047c4-aeca-410e-9393-ca4e74366921"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:53:59.351360 master-0 kubenswrapper[6932]: I0319 11:53:59.351302 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca" (OuterVolumeSpecName: "service-ca") pod "333047c4-aeca-410e-9393-ca4e74366921" (UID: "333047c4-aeca-410e-9393-ca4e74366921"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:53:59.351360 master-0 kubenswrapper[6932]: I0319 11:53:59.351316 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "333047c4-aeca-410e-9393-ca4e74366921" (UID: "333047c4-aeca-410e-9393-ca4e74366921"). InnerVolumeSpecName "etc-cvo-updatepayloads". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:53:59.351486 master-0 kubenswrapper[6932]: I0319 11:53:59.351364 6932 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:59.354211 master-0 kubenswrapper[6932]: I0319 11:53:59.354156 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "333047c4-aeca-410e-9393-ca4e74366921" (UID: "333047c4-aeca-410e-9393-ca4e74366921"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:53:59.357190 master-0 kubenswrapper[6932]: I0319 11:53:59.357105 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "333047c4-aeca-410e-9393-ca4e74366921" (UID: "333047c4-aeca-410e-9393-ca4e74366921"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:53:59.452371 master-0 kubenswrapper[6932]: I0319 11:53:59.452150 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/333047c4-aeca-410e-9393-ca4e74366921-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:59.452371 master-0 kubenswrapper[6932]: I0319 11:53:59.452181 6932 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/333047c4-aeca-410e-9393-ca4e74366921-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:59.452371 master-0 kubenswrapper[6932]: I0319 11:53:59.452192 6932 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/333047c4-aeca-410e-9393-ca4e74366921-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:59.452371 master-0 kubenswrapper[6932]: I0319 11:53:59.452201 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/333047c4-aeca-410e-9393-ca4e74366921-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:59.456135 master-0 kubenswrapper[6932]: I0319 11:53:59.456086 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_fe31583b-cf5c-47f4-9cd3-bb4964baae6e/installer/0.log" Mar 19 11:53:59.456288 master-0 kubenswrapper[6932]: I0319 11:53:59.456158 6932 generic.go:334] "Generic (PLEG): container finished" podID="fe31583b-cf5c-47f4-9cd3-bb4964baae6e" containerID="80609c53dc5ef9b4161df1ba3ad8361d008842c19903c035bffbf987b38faf4e" exitCode=1 Mar 19 11:53:59.456288 master-0 kubenswrapper[6932]: I0319 11:53:59.456263 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" 
event={"ID":"fe31583b-cf5c-47f4-9cd3-bb4964baae6e","Type":"ContainerDied","Data":"80609c53dc5ef9b4161df1ba3ad8361d008842c19903c035bffbf987b38faf4e"} Mar 19 11:53:59.457386 master-0 kubenswrapper[6932]: I0319 11:53:59.457345 6932 generic.go:334] "Generic (PLEG): container finished" podID="333047c4-aeca-410e-9393-ca4e74366921" containerID="785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c" exitCode=0 Mar 19 11:53:59.457461 master-0 kubenswrapper[6932]: I0319 11:53:59.457395 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" Mar 19 11:53:59.457694 master-0 kubenswrapper[6932]: I0319 11:53:59.457383 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" event={"ID":"333047c4-aeca-410e-9393-ca4e74366921","Type":"ContainerDied","Data":"785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c"} Mar 19 11:53:59.457694 master-0 kubenswrapper[6932]: I0319 11:53:59.457691 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-pk574" event={"ID":"333047c4-aeca-410e-9393-ca4e74366921","Type":"ContainerDied","Data":"51659f06b28a4c4f2cd28005c52835b309a9cf7c78a54c2ff2f7be93e57a3eb3"} Mar 19 11:53:59.457825 master-0 kubenswrapper[6932]: I0319 11:53:59.457714 6932 scope.go:117] "RemoveContainer" containerID="785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c" Mar 19 11:53:59.472738 master-0 kubenswrapper[6932]: I0319 11:53:59.472676 6932 scope.go:117] "RemoveContainer" containerID="785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c" Mar 19 11:53:59.473265 master-0 kubenswrapper[6932]: E0319 11:53:59.473221 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c\": container with ID starting with 785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c not found: ID does not exist" containerID="785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c" Mar 19 11:53:59.473320 master-0 kubenswrapper[6932]: I0319 11:53:59.473265 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c"} err="failed to get container status \"785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c\": rpc error: code = NotFound desc = could not find container \"785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c\": container with ID starting with 785e2ba1caabc182ccb6e16b83cef1466b977a7b27b48ca8d4a2e38344896d2c not found: ID does not exist" Mar 19 11:53:59.491663 master-0 kubenswrapper[6932]: I0319 11:53:59.491603 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-pk574"] Mar 19 11:53:59.505752 master-0 kubenswrapper[6932]: I0319 11:53:59.497935 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-pk574"] Mar 19 11:53:59.573185 master-0 kubenswrapper[6932]: I0319 11:53:59.572178 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8"] Mar 19 11:53:59.573436 master-0 kubenswrapper[6932]: E0319 11:53:59.573407 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333047c4-aeca-410e-9393-ca4e74366921" containerName="cluster-version-operator" Mar 19 11:53:59.573489 master-0 kubenswrapper[6932]: I0319 11:53:59.573447 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="333047c4-aeca-410e-9393-ca4e74366921" containerName="cluster-version-operator" Mar 19 11:53:59.573578 master-0 kubenswrapper[6932]: I0319 
11:53:59.573551 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="333047c4-aeca-410e-9393-ca4e74366921" containerName="cluster-version-operator" Mar 19 11:53:59.574273 master-0 kubenswrapper[6932]: I0319 11:53:59.574068 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.576004 master-0 kubenswrapper[6932]: I0319 11:53:59.575972 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 11:53:59.576992 master-0 kubenswrapper[6932]: I0319 11:53:59.576212 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 11:53:59.576992 master-0 kubenswrapper[6932]: I0319 11:53:59.576381 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 11:53:59.658887 master-0 kubenswrapper[6932]: I0319 11:53:59.655677 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.658887 master-0 kubenswrapper[6932]: I0319 11:53:59.655791 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd6ec279-d92f-45c2-97c2-88b96fbd6600-service-ca\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.658887 master-0 kubenswrapper[6932]: I0319 11:53:59.655813 6932 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.658887 master-0 kubenswrapper[6932]: I0319 11:53:59.655840 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6ec279-d92f-45c2-97c2-88b96fbd6600-serving-cert\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.658887 master-0 kubenswrapper[6932]: I0319 11:53:59.655874 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd6ec279-d92f-45c2-97c2-88b96fbd6600-kube-api-access\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.731443 master-0 kubenswrapper[6932]: I0319 11:53:59.731414 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_fe31583b-cf5c-47f4-9cd3-bb4964baae6e/installer/0.log" Mar 19 11:53:59.731690 master-0 kubenswrapper[6932]: I0319 11:53:59.731678 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.756912 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6ec279-d92f-45c2-97c2-88b96fbd6600-serving-cert\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.757021 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd6ec279-d92f-45c2-97c2-88b96fbd6600-kube-api-access\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.757076 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.757161 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd6ec279-d92f-45c2-97c2-88b96fbd6600-service-ca\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.757184 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.757265 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.759676 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.760913 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd6ec279-d92f-45c2-97c2-88b96fbd6600-service-ca\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.764040 master-0 kubenswrapper[6932]: I0319 11:53:59.761546 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6ec279-d92f-45c2-97c2-88b96fbd6600-serving-cert\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " 
pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.776534 master-0 kubenswrapper[6932]: I0319 11:53:59.776158 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd6ec279-d92f-45c2-97c2-88b96fbd6600-kube-api-access\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.858645 master-0 kubenswrapper[6932]: I0319 11:53:59.858545 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kube-api-access\") pod \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " Mar 19 11:53:59.858645 master-0 kubenswrapper[6932]: I0319 11:53:59.858663 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kubelet-dir\") pod \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " Mar 19 11:53:59.858979 master-0 kubenswrapper[6932]: I0319 11:53:59.858768 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-var-lock\") pod \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\" (UID: \"fe31583b-cf5c-47f4-9cd3-bb4964baae6e\") " Mar 19 11:53:59.859145 master-0 kubenswrapper[6932]: I0319 11:53:59.859106 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fe31583b-cf5c-47f4-9cd3-bb4964baae6e" (UID: "fe31583b-cf5c-47f4-9cd3-bb4964baae6e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:53:59.859224 master-0 kubenswrapper[6932]: I0319 11:53:59.859151 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-var-lock" (OuterVolumeSpecName: "var-lock") pod "fe31583b-cf5c-47f4-9cd3-bb4964baae6e" (UID: "fe31583b-cf5c-47f4-9cd3-bb4964baae6e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:53:59.861679 master-0 kubenswrapper[6932]: I0319 11:53:59.861627 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fe31583b-cf5c-47f4-9cd3-bb4964baae6e" (UID: "fe31583b-cf5c-47f4-9cd3-bb4964baae6e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:53:59.880168 master-0 kubenswrapper[6932]: I0319 11:53:59.880109 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333047c4-aeca-410e-9393-ca4e74366921" path="/var/lib/kubelet/pods/333047c4-aeca-410e-9393-ca4e74366921/volumes" Mar 19 11:53:59.903678 master-0 kubenswrapper[6932]: I0319 11:53:59.903630 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:53:59.920770 master-0 kubenswrapper[6932]: W0319 11:53:59.920692 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd6ec279_d92f_45c2_97c2_88b96fbd6600.slice/crio-12f1d4709c9e0d0ad1a233908194f29f84992ca4b99bba4692dddc9f4338c1ef WatchSource:0}: Error finding container 12f1d4709c9e0d0ad1a233908194f29f84992ca4b99bba4692dddc9f4338c1ef: Status 404 returned error can't find the container with id 12f1d4709c9e0d0ad1a233908194f29f84992ca4b99bba4692dddc9f4338c1ef Mar 19 11:53:59.960745 master-0 kubenswrapper[6932]: I0319 11:53:59.960018 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:59.960745 master-0 kubenswrapper[6932]: I0319 11:53:59.960063 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:53:59.960745 master-0 kubenswrapper[6932]: I0319 11:53:59.960079 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe31583b-cf5c-47f4-9cd3-bb4964baae6e-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:00.466467 master-0 kubenswrapper[6932]: I0319 11:54:00.466418 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_fe31583b-cf5c-47f4-9cd3-bb4964baae6e/installer/0.log" Mar 19 11:54:00.467100 master-0 kubenswrapper[6932]: I0319 11:54:00.466506 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" 
event={"ID":"fe31583b-cf5c-47f4-9cd3-bb4964baae6e","Type":"ContainerDied","Data":"1315fe63b992ebfde48753dbbe66d869fa11b59020e96d5bfd968d9ece99cc92"} Mar 19 11:54:00.467100 master-0 kubenswrapper[6932]: I0319 11:54:00.466551 6932 scope.go:117] "RemoveContainer" containerID="80609c53dc5ef9b4161df1ba3ad8361d008842c19903c035bffbf987b38faf4e" Mar 19 11:54:00.467100 master-0 kubenswrapper[6932]: I0319 11:54:00.466632 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:00.498266 master-0 kubenswrapper[6932]: I0319 11:54:00.498147 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" event={"ID":"dd6ec279-d92f-45c2-97c2-88b96fbd6600","Type":"ContainerStarted","Data":"f3ac0ae0109e7f763509e3a212b55f05f415fede7976cd16c8231233ecd95e88"} Mar 19 11:54:00.498533 master-0 kubenswrapper[6932]: I0319 11:54:00.498298 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" event={"ID":"dd6ec279-d92f-45c2-97c2-88b96fbd6600","Type":"ContainerStarted","Data":"12f1d4709c9e0d0ad1a233908194f29f84992ca4b99bba4692dddc9f4338c1ef"} Mar 19 11:54:00.512445 master-0 kubenswrapper[6932]: I0319 11:54:00.512380 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:54:00.515324 master-0 kubenswrapper[6932]: I0319 11:54:00.514641 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:54:01.687900 master-0 kubenswrapper[6932]: I0319 11:54:01.687846 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " 
pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:54:01.688413 master-0 kubenswrapper[6932]: I0319 11:54:01.687945 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:54:01.688413 master-0 kubenswrapper[6932]: I0319 11:54:01.687978 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:54:01.690986 master-0 kubenswrapper[6932]: I0319 11:54:01.690947 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:54:01.691440 master-0 kubenswrapper[6932]: I0319 11:54:01.691389 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:54:01.691835 master-0 kubenswrapper[6932]: I0319 11:54:01.691811 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:54:01.698362 master-0 kubenswrapper[6932]: I0319 11:54:01.698290 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" podStartSLOduration=2.698266907 podStartE2EDuration="2.698266907s" podCreationTimestamp="2026-03-19 11:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:00.531046553 +0000 UTC m=+64.890106775" watchObservedRunningTime="2026-03-19 11:54:01.698266907 +0000 UTC m=+66.057327129" Mar 19 11:54:01.700818 master-0 kubenswrapper[6932]: I0319 11:54:01.700779 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 11:54:01.701037 master-0 kubenswrapper[6932]: E0319 11:54:01.701009 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe31583b-cf5c-47f4-9cd3-bb4964baae6e" containerName="installer" Mar 19 11:54:01.701037 master-0 kubenswrapper[6932]: I0319 11:54:01.701034 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe31583b-cf5c-47f4-9cd3-bb4964baae6e" containerName="installer" Mar 19 11:54:01.701129 master-0 kubenswrapper[6932]: I0319 11:54:01.701114 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe31583b-cf5c-47f4-9cd3-bb4964baae6e" containerName="installer" Mar 19 11:54:01.701566 master-0 kubenswrapper[6932]: I0319 11:54:01.701538 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.711239 master-0 kubenswrapper[6932]: I0319 11:54:01.711180 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 11:54:01.789755 master-0 kubenswrapper[6932]: I0319 11:54:01.789662 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:54:01.789755 master-0 kubenswrapper[6932]: I0319 11:54:01.789757 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:54:01.790020 master-0 kubenswrapper[6932]: I0319 11:54:01.789818 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:54:01.790020 master-0 kubenswrapper[6932]: I0319 11:54:01.789847 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:54:01.790968 master-0 kubenswrapper[6932]: I0319 11:54:01.790900 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/870e66ff-82ed-4c91-8197-dddcb78048c2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.791133 master-0 kubenswrapper[6932]: I0319 11:54:01.791098 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-var-lock\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.791273 master-0 kubenswrapper[6932]: I0319 11:54:01.791186 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.793254 master-0 kubenswrapper[6932]: I0319 11:54:01.793214 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:54:01.794921 master-0 kubenswrapper[6932]: I0319 11:54:01.794823 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:54:01.800452 master-0 kubenswrapper[6932]: I0319 11:54:01.800415 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:54:01.800585 master-0 kubenswrapper[6932]: I0319 11:54:01.800417 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:54:01.828667 master-0 kubenswrapper[6932]: I0319 11:54:01.825805 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-654cf7f8-7lm6v"] Mar 19 11:54:01.828667 master-0 kubenswrapper[6932]: I0319 11:54:01.826012 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" podUID="e7d33d5f-797c-4491-a1e3-1506452d2aff" containerName="controller-manager" containerID="cri-o://a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74" gracePeriod=30 Mar 19 11:54:01.852312 master-0 kubenswrapper[6932]: I0319 11:54:01.852246 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv"] Mar 19 11:54:01.852599 master-0 kubenswrapper[6932]: 
I0319 11:54:01.852479 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" podUID="57761fc9-b8d5-4dd5-8882-9f33ef79111a" containerName="route-controller-manager" containerID="cri-o://aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f" gracePeriod=30 Mar 19 11:54:01.889399 master-0 kubenswrapper[6932]: I0319 11:54:01.889267 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe31583b-cf5c-47f4-9cd3-bb4964baae6e" path="/var/lib/kubelet/pods/fe31583b-cf5c-47f4-9cd3-bb4964baae6e/volumes" Mar 19 11:54:01.893319 master-0 kubenswrapper[6932]: I0319 11:54:01.892645 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/870e66ff-82ed-4c91-8197-dddcb78048c2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.893319 master-0 kubenswrapper[6932]: I0319 11:54:01.892713 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-var-lock\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.893319 master-0 kubenswrapper[6932]: I0319 11:54:01.892752 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.893319 master-0 kubenswrapper[6932]: I0319 11:54:01.892907 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-var-lock\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.893319 master-0 kubenswrapper[6932]: I0319 11:54:01.893077 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.915226 master-0 kubenswrapper[6932]: I0319 11:54:01.915031 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/870e66ff-82ed-4c91-8197-dddcb78048c2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:01.925942 master-0 kubenswrapper[6932]: I0319 11:54:01.925848 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:54:01.925942 master-0 kubenswrapper[6932]: I0319 11:54:01.925882 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:54:01.926930 master-0 kubenswrapper[6932]: I0319 11:54:01.926841 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:54:01.942257 master-0 kubenswrapper[6932]: I0319 11:54:01.942195 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:54:01.942257 master-0 kubenswrapper[6932]: I0319 11:54:01.942244 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:54:01.942525 master-0 kubenswrapper[6932]: I0319 11:54:01.942385 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:54:01.943193 master-0 kubenswrapper[6932]: I0319 11:54:01.943035 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:54:02.013040 master-0 kubenswrapper[6932]: I0319 11:54:02.010254 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ztgjs" Mar 19 11:54:02.055919 master-0 kubenswrapper[6932]: I0319 11:54:02.055838 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:02.377505 master-0 kubenswrapper[6932]: I0319 11:54:02.377442 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"] Mar 19 11:54:02.463357 master-0 kubenswrapper[6932]: I0319 11:54:02.462222 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:54:02.468869 master-0 kubenswrapper[6932]: I0319 11:54:02.468827 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:54:02.523007 master-0 kubenswrapper[6932]: I0319 11:54:02.519362 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" event={"ID":"cf08ab4f-c203-4c16-9826-8cc049f4af31","Type":"ContainerStarted","Data":"e8bcebf454a79198b14303cf41946d1cf832021a30a2591e1b23c6740fca1e9b"} Mar 19 11:54:02.526850 master-0 kubenswrapper[6932]: I0319 11:54:02.525626 6932 generic.go:334] "Generic (PLEG): container finished" podID="57761fc9-b8d5-4dd5-8882-9f33ef79111a" containerID="aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f" exitCode=0 Mar 19 11:54:02.526850 master-0 kubenswrapper[6932]: I0319 11:54:02.525705 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" event={"ID":"57761fc9-b8d5-4dd5-8882-9f33ef79111a","Type":"ContainerDied","Data":"aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f"} Mar 19 11:54:02.526850 master-0 kubenswrapper[6932]: I0319 11:54:02.525757 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" event={"ID":"57761fc9-b8d5-4dd5-8882-9f33ef79111a","Type":"ContainerDied","Data":"71125a65a7cc09af08b5fe4545a16ebf2099cb6cf96c5de1eb791846f48b9224"} Mar 19 11:54:02.526850 master-0 kubenswrapper[6932]: I0319 11:54:02.525784 6932 scope.go:117] "RemoveContainer" containerID="aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f" Mar 19 11:54:02.526850 master-0 kubenswrapper[6932]: I0319 11:54:02.525958 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv" Mar 19 11:54:02.552823 master-0 kubenswrapper[6932]: I0319 11:54:02.552432 6932 generic.go:334] "Generic (PLEG): container finished" podID="e7d33d5f-797c-4491-a1e3-1506452d2aff" containerID="a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74" exitCode=0 Mar 19 11:54:02.552823 master-0 kubenswrapper[6932]: I0319 11:54:02.552491 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" event={"ID":"e7d33d5f-797c-4491-a1e3-1506452d2aff","Type":"ContainerDied","Data":"a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74"} Mar 19 11:54:02.553473 master-0 kubenswrapper[6932]: I0319 11:54:02.552524 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" event={"ID":"e7d33d5f-797c-4491-a1e3-1506452d2aff","Type":"ContainerDied","Data":"17ef7d3c5afdf38a9291cfc7bd1abddb4ccc85cfe41ec061ac20ffbde675f248"} Mar 19 11:54:02.563986 master-0 kubenswrapper[6932]: I0319 11:54:02.557802 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-654cf7f8-7lm6v" Mar 19 11:54:02.600782 master-0 kubenswrapper[6932]: I0319 11:54:02.591045 6932 scope.go:117] "RemoveContainer" containerID="aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f" Mar 19 11:54:02.608932 master-0 kubenswrapper[6932]: E0319 11:54:02.608883 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f\": container with ID starting with aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f not found: ID does not exist" containerID="aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f" Mar 19 11:54:02.609044 master-0 kubenswrapper[6932]: I0319 11:54:02.608931 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f"} err="failed to get container status \"aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f\": rpc error: code = NotFound desc = could not find container \"aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f\": container with ID starting with aea4eb33404e362aaaa566e39148f877616979a0ae86df14444de3d19210ab9f not found: ID does not exist" Mar 19 11:54:02.609044 master-0 kubenswrapper[6932]: I0319 11:54:02.608960 6932 scope.go:117] "RemoveContainer" containerID="a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74" Mar 19 11:54:02.609132 master-0 kubenswrapper[6932]: I0319 11:54:02.609029 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-client-ca\") pod \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " Mar 19 11:54:02.609132 master-0 kubenswrapper[6932]: I0319 11:54:02.609096 6932 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57761fc9-b8d5-4dd5-8882-9f33ef79111a-serving-cert\") pod \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " Mar 19 11:54:02.609210 master-0 kubenswrapper[6932]: I0319 11:54:02.609129 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd5w8\" (UniqueName: \"kubernetes.io/projected/57761fc9-b8d5-4dd5-8882-9f33ef79111a-kube-api-access-vd5w8\") pod \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " Mar 19 11:54:02.609210 master-0 kubenswrapper[6932]: I0319 11:54:02.609172 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-config\") pod \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\" (UID: \"57761fc9-b8d5-4dd5-8882-9f33ef79111a\") " Mar 19 11:54:02.609279 master-0 kubenswrapper[6932]: I0319 11:54:02.609212 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-proxy-ca-bundles\") pod \"e7d33d5f-797c-4491-a1e3-1506452d2aff\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " Mar 19 11:54:02.618297 master-0 kubenswrapper[6932]: I0319 11:54:02.618250 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-client-ca\") pod \"e7d33d5f-797c-4491-a1e3-1506452d2aff\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " Mar 19 11:54:02.618428 master-0 kubenswrapper[6932]: I0319 11:54:02.618298 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7d33d5f-797c-4491-a1e3-1506452d2aff-serving-cert\") 
pod \"e7d33d5f-797c-4491-a1e3-1506452d2aff\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " Mar 19 11:54:02.618428 master-0 kubenswrapper[6932]: I0319 11:54:02.618383 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-config\") pod \"e7d33d5f-797c-4491-a1e3-1506452d2aff\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " Mar 19 11:54:02.619069 master-0 kubenswrapper[6932]: I0319 11:54:02.618494 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d9dz\" (UniqueName: \"kubernetes.io/projected/e7d33d5f-797c-4491-a1e3-1506452d2aff-kube-api-access-8d9dz\") pod \"e7d33d5f-797c-4491-a1e3-1506452d2aff\" (UID: \"e7d33d5f-797c-4491-a1e3-1506452d2aff\") " Mar 19 11:54:02.621603 master-0 kubenswrapper[6932]: I0319 11:54:02.621548 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-config" (OuterVolumeSpecName: "config") pod "57761fc9-b8d5-4dd5-8882-9f33ef79111a" (UID: "57761fc9-b8d5-4dd5-8882-9f33ef79111a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:02.622717 master-0 kubenswrapper[6932]: I0319 11:54:02.622243 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-client-ca" (OuterVolumeSpecName: "client-ca") pod "e7d33d5f-797c-4491-a1e3-1506452d2aff" (UID: "e7d33d5f-797c-4491-a1e3-1506452d2aff"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:02.622717 master-0 kubenswrapper[6932]: I0319 11:54:02.622650 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e7d33d5f-797c-4491-a1e3-1506452d2aff" (UID: "e7d33d5f-797c-4491-a1e3-1506452d2aff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:02.623062 master-0 kubenswrapper[6932]: I0319 11:54:02.623025 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-client-ca" (OuterVolumeSpecName: "client-ca") pod "57761fc9-b8d5-4dd5-8882-9f33ef79111a" (UID: "57761fc9-b8d5-4dd5-8882-9f33ef79111a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:02.624874 master-0 kubenswrapper[6932]: I0319 11:54:02.624834 6932 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:02.624874 master-0 kubenswrapper[6932]: I0319 11:54:02.624867 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57761fc9-b8d5-4dd5-8882-9f33ef79111a-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:02.624874 master-0 kubenswrapper[6932]: I0319 11:54:02.624878 6932 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:02.625049 master-0 kubenswrapper[6932]: I0319 11:54:02.624888 6932 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:02.630389 master-0 kubenswrapper[6932]: I0319 11:54:02.626133 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57761fc9-b8d5-4dd5-8882-9f33ef79111a-kube-api-access-vd5w8" (OuterVolumeSpecName: "kube-api-access-vd5w8") pod "57761fc9-b8d5-4dd5-8882-9f33ef79111a" (UID: "57761fc9-b8d5-4dd5-8882-9f33ef79111a"). InnerVolumeSpecName "kube-api-access-vd5w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:02.630389 master-0 kubenswrapper[6932]: I0319 11:54:02.626190 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57761fc9-b8d5-4dd5-8882-9f33ef79111a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "57761fc9-b8d5-4dd5-8882-9f33ef79111a" (UID: "57761fc9-b8d5-4dd5-8882-9f33ef79111a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:54:02.630389 master-0 kubenswrapper[6932]: I0319 11:54:02.630294 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-config" (OuterVolumeSpecName: "config") pod "e7d33d5f-797c-4491-a1e3-1506452d2aff" (UID: "e7d33d5f-797c-4491-a1e3-1506452d2aff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:02.630389 master-0 kubenswrapper[6932]: I0319 11:54:02.630373 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"] Mar 19 11:54:02.633945 master-0 kubenswrapper[6932]: I0319 11:54:02.631057 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d33d5f-797c-4491-a1e3-1506452d2aff-kube-api-access-8d9dz" (OuterVolumeSpecName: "kube-api-access-8d9dz") pod "e7d33d5f-797c-4491-a1e3-1506452d2aff" (UID: "e7d33d5f-797c-4491-a1e3-1506452d2aff"). InnerVolumeSpecName "kube-api-access-8d9dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:02.639551 master-0 kubenswrapper[6932]: I0319 11:54:02.639488 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d33d5f-797c-4491-a1e3-1506452d2aff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7d33d5f-797c-4491-a1e3-1506452d2aff" (UID: "e7d33d5f-797c-4491-a1e3-1506452d2aff"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:54:02.649167 master-0 kubenswrapper[6932]: I0319 11:54:02.649098 6932 scope.go:117] "RemoveContainer" containerID="a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74" Mar 19 11:54:02.649748 master-0 kubenswrapper[6932]: E0319 11:54:02.649630 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74\": container with ID starting with a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74 not found: ID does not exist" containerID="a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74" Mar 19 11:54:02.649891 master-0 kubenswrapper[6932]: I0319 11:54:02.649705 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74"} err="failed to get container status \"a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74\": rpc error: code = NotFound desc = could not find container \"a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74\": container with ID starting with a702a8b2551a8cc2cd855896867355537b670e597276dfea5a1cc7a37a886b74 not found: ID does not exist" Mar 19 11:54:02.653463 master-0 kubenswrapper[6932]: I0319 11:54:02.653417 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq"] Mar 19 11:54:02.657929 master-0 kubenswrapper[6932]: I0319 11:54:02.657904 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9"] Mar 19 11:54:02.725754 master-0 kubenswrapper[6932]: I0319 11:54:02.725631 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d33d5f-797c-4491-a1e3-1506452d2aff-config\") on node \"master-0\" 
DevicePath \"\"" Mar 19 11:54:02.725754 master-0 kubenswrapper[6932]: I0319 11:54:02.725673 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d9dz\" (UniqueName: \"kubernetes.io/projected/e7d33d5f-797c-4491-a1e3-1506452d2aff-kube-api-access-8d9dz\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:02.725754 master-0 kubenswrapper[6932]: I0319 11:54:02.725689 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57761fc9-b8d5-4dd5-8882-9f33ef79111a-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:02.725754 master-0 kubenswrapper[6932]: I0319 11:54:02.725699 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd5w8\" (UniqueName: \"kubernetes.io/projected/57761fc9-b8d5-4dd5-8882-9f33ef79111a-kube-api-access-vd5w8\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:02.725754 master-0 kubenswrapper[6932]: I0319 11:54:02.725709 6932 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7d33d5f-797c-4491-a1e3-1506452d2aff-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:02.802539 master-0 kubenswrapper[6932]: I0319 11:54:02.800837 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f6wv7"] Mar 19 11:54:02.811580 master-0 kubenswrapper[6932]: W0319 11:54:02.810436 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29b11ce_60e0_46b3_8d28_eea3452513cd.slice/crio-2c0b2d29cecf537e4921aa4396580e1259d6519819244de28dc54db9b3eeb9d0 WatchSource:0}: Error finding container 2c0b2d29cecf537e4921aa4396580e1259d6519819244de28dc54db9b3eeb9d0: Status 404 returned error can't find the container with id 2c0b2d29cecf537e4921aa4396580e1259d6519819244de28dc54db9b3eeb9d0 Mar 19 11:54:02.840151 master-0 kubenswrapper[6932]: I0319 11:54:02.840072 6932 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-bftt4"] Mar 19 11:54:02.846886 master-0 kubenswrapper[6932]: I0319 11:54:02.846843 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 11:54:02.859710 master-0 kubenswrapper[6932]: I0319 11:54:02.859654 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"] Mar 19 11:54:02.865539 master-0 kubenswrapper[6932]: W0319 11:54:02.865502 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3de8a1b_a5be_414f_86e8_738e16c8bc97.slice/crio-e6908445d4f9d29994371a77f0165de1617d0b3d69f7e33acfc73003f26e2111 WatchSource:0}: Error finding container e6908445d4f9d29994371a77f0165de1617d0b3d69f7e33acfc73003f26e2111: Status 404 returned error can't find the container with id e6908445d4f9d29994371a77f0165de1617d0b3d69f7e33acfc73003f26e2111 Mar 19 11:54:02.866782 master-0 kubenswrapper[6932]: W0319 11:54:02.866722 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod870e66ff_82ed_4c91_8197_dddcb78048c2.slice/crio-e775e2237b3e13100c3e1ab188e2c83cffcee4d11a252841dd57b5b92e5e9841 WatchSource:0}: Error finding container e775e2237b3e13100c3e1ab188e2c83cffcee4d11a252841dd57b5b92e5e9841: Status 404 returned error can't find the container with id e775e2237b3e13100c3e1ab188e2c83cffcee4d11a252841dd57b5b92e5e9841 Mar 19 11:54:02.927174 master-0 kubenswrapper[6932]: I0319 11:54:02.927123 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv"] Mar 19 11:54:02.937285 master-0 kubenswrapper[6932]: I0319 11:54:02.937231 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6c859bf8-g2lgv"] Mar 19 
11:54:02.949771 master-0 kubenswrapper[6932]: I0319 11:54:02.949517 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-654cf7f8-7lm6v"] Mar 19 11:54:02.952337 master-0 kubenswrapper[6932]: I0319 11:54:02.952304 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-654cf7f8-7lm6v"] Mar 19 11:54:03.023321 master-0 kubenswrapper[6932]: I0319 11:54:03.021841 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-548bb99f44-txbjj"] Mar 19 11:54:03.023321 master-0 kubenswrapper[6932]: E0319 11:54:03.022038 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d33d5f-797c-4491-a1e3-1506452d2aff" containerName="controller-manager" Mar 19 11:54:03.023321 master-0 kubenswrapper[6932]: I0319 11:54:03.022050 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d33d5f-797c-4491-a1e3-1506452d2aff" containerName="controller-manager" Mar 19 11:54:03.023321 master-0 kubenswrapper[6932]: E0319 11:54:03.022060 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57761fc9-b8d5-4dd5-8882-9f33ef79111a" containerName="route-controller-manager" Mar 19 11:54:03.023321 master-0 kubenswrapper[6932]: I0319 11:54:03.022067 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="57761fc9-b8d5-4dd5-8882-9f33ef79111a" containerName="route-controller-manager" Mar 19 11:54:03.023321 master-0 kubenswrapper[6932]: I0319 11:54:03.022146 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="57761fc9-b8d5-4dd5-8882-9f33ef79111a" containerName="route-controller-manager" Mar 19 11:54:03.023321 master-0 kubenswrapper[6932]: I0319 11:54:03.022163 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d33d5f-797c-4491-a1e3-1506452d2aff" containerName="controller-manager" Mar 19 11:54:03.023321 master-0 kubenswrapper[6932]: I0319 11:54:03.022489 6932 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.030338 master-0 kubenswrapper[6932]: I0319 11:54:03.026860 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 11:54:03.030992 master-0 kubenswrapper[6932]: I0319 11:54:03.030883 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 11:54:03.031306 master-0 kubenswrapper[6932]: I0319 11:54:03.031238 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 11:54:03.033818 master-0 kubenswrapper[6932]: I0319 11:54:03.033694 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"] Mar 19 11:54:03.034255 master-0 kubenswrapper[6932]: I0319 11:54:03.033740 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 11:54:03.035320 master-0 kubenswrapper[6932]: I0319 11:54:03.035108 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.036493 master-0 kubenswrapper[6932]: I0319 11:54:03.036404 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 11:54:03.037608 master-0 kubenswrapper[6932]: I0319 11:54:03.037019 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 11:54:03.038199 master-0 kubenswrapper[6932]: I0319 11:54:03.037917 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 11:54:03.038691 master-0 kubenswrapper[6932]: I0319 11:54:03.038276 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 11:54:03.038691 master-0 kubenswrapper[6932]: I0319 11:54:03.038498 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 11:54:03.039152 master-0 kubenswrapper[6932]: I0319 11:54:03.038876 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 11:54:03.039152 master-0 kubenswrapper[6932]: I0319 11:54:03.039022 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-qsmbf" Mar 19 11:54:03.039398 master-0 kubenswrapper[6932]: I0319 11:54:03.039323 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 11:54:03.044757 master-0 kubenswrapper[6932]: I0319 11:54:03.044471 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-548bb99f44-txbjj"] Mar 19 11:54:03.044757 master-0 kubenswrapper[6932]: I0319 11:54:03.044516 6932 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"] Mar 19 11:54:03.134303 master-0 kubenswrapper[6932]: I0319 11:54:03.134264 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5skx\" (UniqueName: \"kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.134561 master-0 kubenswrapper[6932]: I0319 11:54:03.134544 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj527\" (UniqueName: \"kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.134683 master-0 kubenswrapper[6932]: I0319 11:54:03.134665 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.134830 master-0 kubenswrapper[6932]: I0319 11:54:03.134814 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " 
pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.134910 master-0 kubenswrapper[6932]: I0319 11:54:03.134898 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.135031 master-0 kubenswrapper[6932]: I0319 11:54:03.135012 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.135136 master-0 kubenswrapper[6932]: I0319 11:54:03.135123 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.135232 master-0 kubenswrapper[6932]: I0319 11:54:03.135220 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.135412 master-0 kubenswrapper[6932]: I0319 11:54:03.135332 6932 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.240485 master-0 kubenswrapper[6932]: I0319 11:54:03.239804 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.240485 master-0 kubenswrapper[6932]: I0319 11:54:03.240071 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.240485 master-0 kubenswrapper[6932]: I0319 11:54:03.240136 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.240485 master-0 kubenswrapper[6932]: I0319 11:54:03.240174 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " 
pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.240485 master-0 kubenswrapper[6932]: I0319 11:54:03.240330 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5skx\" (UniqueName: \"kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.240485 master-0 kubenswrapper[6932]: I0319 11:54:03.240391 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj527\" (UniqueName: \"kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.240999 master-0 kubenswrapper[6932]: I0319 11:54:03.240622 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.240999 master-0 kubenswrapper[6932]: I0319 11:54:03.240650 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.240999 master-0 kubenswrapper[6932]: I0319 11:54:03.240670 6932 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.241119 master-0 kubenswrapper[6932]: I0319 11:54:03.241095 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.243616 master-0 kubenswrapper[6932]: I0319 11:54:03.241617 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.243616 master-0 kubenswrapper[6932]: I0319 11:54:03.242945 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.245493 master-0 kubenswrapper[6932]: I0319 11:54:03.245435 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " 
pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.246629 master-0 kubenswrapper[6932]: I0319 11:54:03.246581 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.247699 master-0 kubenswrapper[6932]: I0319 11:54:03.247664 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.261169 master-0 kubenswrapper[6932]: I0319 11:54:03.259768 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.264803 master-0 kubenswrapper[6932]: I0319 11:54:03.263829 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj527\" (UniqueName: \"kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.270683 master-0 kubenswrapper[6932]: I0319 11:54:03.270586 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5skx\" (UniqueName: 
\"kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.379845 master-0 kubenswrapper[6932]: I0319 11:54:03.379252 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:03.390517 master-0 kubenswrapper[6932]: I0319 11:54:03.389490 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:03.571511 master-0 kubenswrapper[6932]: I0319 11:54:03.571301 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" event={"ID":"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7","Type":"ContainerStarted","Data":"975e632bf87b61a6785fc741d9417b8abbd6243ba2abd8088f9fe581fcfef90c"} Mar 19 11:54:03.573888 master-0 kubenswrapper[6932]: I0319 11:54:03.573848 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" event={"ID":"b3de8a1b-a5be-414f-86e8-738e16c8bc97","Type":"ContainerStarted","Data":"e6908445d4f9d29994371a77f0165de1617d0b3d69f7e33acfc73003f26e2111"} Mar 19 11:54:03.575861 master-0 kubenswrapper[6932]: I0319 11:54:03.575836 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" event={"ID":"e5078f17-bc65-460f-9f18-8c506db6840b","Type":"ContainerStarted","Data":"ad6eea5cbfaee49e4b15cd2882873d731a418d8ee3928b9f990016e9dfd746c5"} Mar 19 11:54:03.575921 master-0 kubenswrapper[6932]: I0319 11:54:03.575863 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" event={"ID":"e5078f17-bc65-460f-9f18-8c506db6840b","Type":"ContainerStarted","Data":"82ce23dbad1fafac03170cf8dbdc37b1358bba5d494b0305bc59731ec33ac062"} Mar 19 11:54:03.577416 master-0 kubenswrapper[6932]: I0319 11:54:03.577385 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" event={"ID":"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9","Type":"ContainerStarted","Data":"14576382107dc09a133f25dfe11c859b57d691f83816910915dfdbd5db8c6773"} Mar 19 11:54:03.581366 master-0 kubenswrapper[6932]: I0319 11:54:03.581327 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"870e66ff-82ed-4c91-8197-dddcb78048c2","Type":"ContainerStarted","Data":"42a335ff2e41047c0beba4d30a5bd16330153a9f1ce92821358c191efd6f3fc9"} Mar 19 11:54:03.581433 master-0 kubenswrapper[6932]: I0319 11:54:03.581372 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"870e66ff-82ed-4c91-8197-dddcb78048c2","Type":"ContainerStarted","Data":"e775e2237b3e13100c3e1ab188e2c83cffcee4d11a252841dd57b5b92e5e9841"} Mar 19 11:54:03.591349 master-0 kubenswrapper[6932]: I0319 11:54:03.591086 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" event={"ID":"716c2176-50f9-4c4f-af0e-4c7973457df2","Type":"ContainerStarted","Data":"862dbe8b648f15ee3ab2e74272152e657f518e1985ef0d38baf17c28a33a4abb"} Mar 19 11:54:03.595114 master-0 kubenswrapper[6932]: I0319 11:54:03.595056 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6wv7" event={"ID":"f29b11ce-60e0-46b3-8d28-eea3452513cd","Type":"ContainerStarted","Data":"2c0b2d29cecf537e4921aa4396580e1259d6519819244de28dc54db9b3eeb9d0"} Mar 19 11:54:03.603389 master-0 kubenswrapper[6932]: 
I0319 11:54:03.603322 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=2.60330033 podStartE2EDuration="2.60330033s" podCreationTimestamp="2026-03-19 11:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:03.598029191 +0000 UTC m=+67.957089423" watchObservedRunningTime="2026-03-19 11:54:03.60330033 +0000 UTC m=+67.962360552" Mar 19 11:54:03.906825 master-0 kubenswrapper[6932]: W0319 11:54:03.906657 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76cf2b01_33d9_47eb_be5d_44946c78bf20.slice/crio-eab7f63dc5326173ea1e6327285462aa6a81c9b141ac54e3d2487017aec7ef32 WatchSource:0}: Error finding container eab7f63dc5326173ea1e6327285462aa6a81c9b141ac54e3d2487017aec7ef32: Status 404 returned error can't find the container with id eab7f63dc5326173ea1e6327285462aa6a81c9b141ac54e3d2487017aec7ef32 Mar 19 11:54:03.909936 master-0 kubenswrapper[6932]: I0319 11:54:03.909887 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57761fc9-b8d5-4dd5-8882-9f33ef79111a" path="/var/lib/kubelet/pods/57761fc9-b8d5-4dd5-8882-9f33ef79111a/volumes" Mar 19 11:54:03.911151 master-0 kubenswrapper[6932]: I0319 11:54:03.911078 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d33d5f-797c-4491-a1e3-1506452d2aff" path="/var/lib/kubelet/pods/e7d33d5f-797c-4491-a1e3-1506452d2aff/volumes" Mar 19 11:54:03.920254 master-0 kubenswrapper[6932]: I0319 11:54:03.920064 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-548bb99f44-txbjj"] Mar 19 11:54:03.920254 master-0 kubenswrapper[6932]: I0319 11:54:03.920133 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"] Mar 19 11:54:03.920413 master-0 kubenswrapper[6932]: W0319 11:54:03.920308 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5c0bb87_0d65_4d7c_9ddd_a4889f0ebb85.slice/crio-c0337cf9dcdc7cc749cac3adad0f44d0d5457a466ca84750f37317d1eb4a70f1 WatchSource:0}: Error finding container c0337cf9dcdc7cc749cac3adad0f44d0d5457a466ca84750f37317d1eb4a70f1: Status 404 returned error can't find the container with id c0337cf9dcdc7cc749cac3adad0f44d0d5457a466ca84750f37317d1eb4a70f1 Mar 19 11:54:04.619762 master-0 kubenswrapper[6932]: I0319 11:54:04.612549 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" event={"ID":"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85","Type":"ContainerStarted","Data":"f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb"} Mar 19 11:54:04.619762 master-0 kubenswrapper[6932]: I0319 11:54:04.612610 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" event={"ID":"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85","Type":"ContainerStarted","Data":"c0337cf9dcdc7cc749cac3adad0f44d0d5457a466ca84750f37317d1eb4a70f1"} Mar 19 11:54:04.619762 master-0 kubenswrapper[6932]: I0319 11:54:04.614755 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:04.627840 master-0 kubenswrapper[6932]: I0319 11:54:04.622979 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:54:04.627840 master-0 kubenswrapper[6932]: I0319 11:54:04.627337 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" event={"ID":"76cf2b01-33d9-47eb-be5d-44946c78bf20","Type":"ContainerStarted","Data":"87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5"} Mar 19 11:54:04.627840 master-0 kubenswrapper[6932]: I0319 11:54:04.627386 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" event={"ID":"76cf2b01-33d9-47eb-be5d-44946c78bf20","Type":"ContainerStarted","Data":"eab7f63dc5326173ea1e6327285462aa6a81c9b141ac54e3d2487017aec7ef32"} Mar 19 11:54:04.627840 master-0 kubenswrapper[6932]: I0319 11:54:04.627405 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:04.650255 master-0 kubenswrapper[6932]: I0319 11:54:04.650191 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:54:04.650854 master-0 kubenswrapper[6932]: I0319 11:54:04.650649 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" podStartSLOduration=3.6506268779999997 podStartE2EDuration="3.650626878s" podCreationTimestamp="2026-03-19 11:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:04.647878856 +0000 UTC m=+69.006939098" watchObservedRunningTime="2026-03-19 11:54:04.650626878 +0000 UTC m=+69.009687090" Mar 19 11:54:04.691758 master-0 kubenswrapper[6932]: I0319 11:54:04.690802 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" podStartSLOduration=3.69078508 podStartE2EDuration="3.69078508s" podCreationTimestamp="2026-03-19 11:54:01 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:04.687000856 +0000 UTC m=+69.046061088" watchObservedRunningTime="2026-03-19 11:54:04.69078508 +0000 UTC m=+69.049845302" Mar 19 11:54:07.655582 master-0 kubenswrapper[6932]: I0319 11:54:07.655522 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" event={"ID":"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7","Type":"ContainerStarted","Data":"fd263d596db29c9074c9bdeb64bbf7299d71e22e2b7ef560f862c8a5aa1f42ef"} Mar 19 11:54:07.662379 master-0 kubenswrapper[6932]: I0319 11:54:07.662286 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" event={"ID":"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9","Type":"ContainerStarted","Data":"a877ef6ce6ae3d0c734f7004326169c4a34f2a88cb9e94e14d820acdeb499a26"} Mar 19 11:54:07.664644 master-0 kubenswrapper[6932]: I0319 11:54:07.664599 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" event={"ID":"b3de8a1b-a5be-414f-86e8-738e16c8bc97","Type":"ContainerStarted","Data":"ecca7c744f565812652616c950bf4c3ba074defb48c439f60ea10ec59b205e80"} Mar 19 11:54:07.665311 master-0 kubenswrapper[6932]: I0319 11:54:07.665276 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:54:07.669933 master-0 kubenswrapper[6932]: I0319 11:54:07.669898 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 19 11:54:07.670044 master-0 kubenswrapper[6932]: I0319 11:54:07.669950 6932 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 19 11:54:08.678708 master-0 kubenswrapper[6932]: I0319 11:54:08.678634 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" event={"ID":"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7","Type":"ContainerStarted","Data":"0afce24a5d5f93336e577364d7c0df2f3a4ed2cf2501e8357b1b537f30d7ce5e"} Mar 19 11:54:08.681780 master-0 kubenswrapper[6932]: I0319 11:54:08.681751 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:54:09.688674 master-0 kubenswrapper[6932]: I0319 11:54:09.687746 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6wv7" event={"ID":"f29b11ce-60e0-46b3-8d28-eea3452513cd","Type":"ContainerStarted","Data":"a22318be5dd5f7d34d14240c83a76aea09070e40bc5acce4f9b5c123f098d012"} Mar 19 11:54:09.688674 master-0 kubenswrapper[6932]: I0319 11:54:09.687849 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6wv7" event={"ID":"f29b11ce-60e0-46b3-8d28-eea3452513cd","Type":"ContainerStarted","Data":"ad7b394ffee1f7dbf2729fc390602d373d1ee63eb4b902945dde9858335d1a1f"} Mar 19 11:54:12.278888 master-0 kubenswrapper[6932]: I0319 11:54:12.276776 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 11:54:12.278888 master-0 kubenswrapper[6932]: I0319 11:54:12.277034 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="c385dd73-4a25-4827-9c8f-d923afc782b7" 
containerName="installer" containerID="cri-o://168c2214cdbfaaf9c363e282042065f817a6654cada35b4a09dd8914621dd3a3" gracePeriod=30 Mar 19 11:54:12.571382 master-0 kubenswrapper[6932]: I0319 11:54:12.567141 6932 patch_prober.go:28] interesting pod/authentication-operator-5885bfd7f4-gqd94 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body= Mar 19 11:54:12.571382 master-0 kubenswrapper[6932]: I0319 11:54:12.567216 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" podUID="732989c5-1b89-46f0-9917-b68613f7f005" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" Mar 19 11:54:12.706061 master-0 kubenswrapper[6932]: I0319 11:54:12.705985 6932 generic.go:334] "Generic (PLEG): container finished" podID="732989c5-1b89-46f0-9917-b68613f7f005" containerID="4ee16bcaa03f25cf971556786ccb51f285719b794843e45ad52bd8134e676a54" exitCode=0 Mar 19 11:54:12.706274 master-0 kubenswrapper[6932]: I0319 11:54:12.706080 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" event={"ID":"732989c5-1b89-46f0-9917-b68613f7f005","Type":"ContainerDied","Data":"4ee16bcaa03f25cf971556786ccb51f285719b794843e45ad52bd8134e676a54"} Mar 19 11:54:12.706697 master-0 kubenswrapper[6932]: I0319 11:54:12.706674 6932 scope.go:117] "RemoveContainer" containerID="4ee16bcaa03f25cf971556786ccb51f285719b794843e45ad52bd8134e676a54" Mar 19 11:54:12.713305 master-0 kubenswrapper[6932]: I0319 11:54:12.713236 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" 
event={"ID":"e5078f17-bc65-460f-9f18-8c506db6840b","Type":"ContainerStarted","Data":"33100cc62ba3c2cb3d48555fa7a3a2daa0cd4660498a3f134f809d70cb042114"} Mar 19 11:54:12.713689 master-0 kubenswrapper[6932]: I0319 11:54:12.713647 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:54:12.716478 master-0 kubenswrapper[6932]: I0319 11:54:12.716432 6932 generic.go:334] "Generic (PLEG): container finished" podID="39d3ac31-9259-454b-8e1c-e23024f8f2b2" containerID="5e2f36e1befc8e73ca3645b7b8f74e7be8e2177e72629e38b72062f0d512ab82" exitCode=0 Mar 19 11:54:12.716559 master-0 kubenswrapper[6932]: I0319 11:54:12.716500 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" event={"ID":"39d3ac31-9259-454b-8e1c-e23024f8f2b2","Type":"ContainerDied","Data":"5e2f36e1befc8e73ca3645b7b8f74e7be8e2177e72629e38b72062f0d512ab82"} Mar 19 11:54:12.717031 master-0 kubenswrapper[6932]: I0319 11:54:12.716997 6932 scope.go:117] "RemoveContainer" containerID="5e2f36e1befc8e73ca3645b7b8f74e7be8e2177e72629e38b72062f0d512ab82" Mar 19 11:54:12.723668 master-0 kubenswrapper[6932]: I0319 11:54:12.723233 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" event={"ID":"cf08ab4f-c203-4c16-9826-8cc049f4af31","Type":"ContainerStarted","Data":"04269cec252a1ddf65705ab5398809758d361672b8dc175714b3a769acc94db8"} Mar 19 11:54:12.723668 master-0 kubenswrapper[6932]: I0319 11:54:12.723665 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:54:12.725340 master-0 kubenswrapper[6932]: I0319 11:54:12.725199 6932 patch_prober.go:28] interesting pod/catalog-operator-68f85b4d6c-n5gr9 container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.8:8443/healthz\": dial tcp 10.128.0.8:8443: connect: connection refused" start-of-body= Mar 19 11:54:12.725340 master-0 kubenswrapper[6932]: I0319 11:54:12.725236 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" podUID="cf08ab4f-c203-4c16-9826-8cc049f4af31" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.128.0.8:8443/healthz\": dial tcp 10.128.0.8:8443: connect: connection refused" Mar 19 11:54:12.726839 master-0 kubenswrapper[6932]: I0319 11:54:12.726390 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:54:12.727279 master-0 kubenswrapper[6932]: I0319 11:54:12.727255 6932 patch_prober.go:28] interesting pod/olm-operator-5c9796789-l9sw9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.11:8443/healthz\": dial tcp 10.128.0.11:8443: connect: connection refused" start-of-body= Mar 19 11:54:12.727381 master-0 kubenswrapper[6932]: I0319 11:54:12.727284 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" podUID="716c2176-50f9-4c4f-af0e-4c7973457df2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.128.0.11:8443/healthz\": dial tcp 10.128.0.11:8443: connect: connection refused" Mar 19 11:54:13.556028 master-0 kubenswrapper[6932]: I0319 11:54:13.555925 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9"] Mar 19 11:54:13.556975 master-0 kubenswrapper[6932]: I0319 11:54:13.556943 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:54:13.560964 master-0 kubenswrapper[6932]: I0319 11:54:13.560913 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-7gswr" Mar 19 11:54:13.561277 master-0 kubenswrapper[6932]: I0319 11:54:13.561252 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 11:54:13.561567 master-0 kubenswrapper[6932]: I0319 11:54:13.561544 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 11:54:13.561847 master-0 kubenswrapper[6932]: I0319 11:54:13.561819 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 11:54:13.585487 master-0 kubenswrapper[6932]: I0319 11:54:13.585418 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9"] Mar 19 11:54:13.650669 master-0 kubenswrapper[6932]: I0319 11:54:13.650609 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfxw7\" (UniqueName: \"kubernetes.io/projected/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-kube-api-access-cfxw7\") pod \"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:54:13.650669 master-0 kubenswrapper[6932]: I0319 11:54:13.650672 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:54:13.731331 master-0 kubenswrapper[6932]: I0319 11:54:13.731274 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" event={"ID":"716c2176-50f9-4c4f-af0e-4c7973457df2","Type":"ContainerStarted","Data":"1a4c8f14e930c50b0952b3694943a840c7990a1c5b1da1e7239dd84dfb62334b"} Mar 19 11:54:13.735286 master-0 kubenswrapper[6932]: I0319 11:54:13.735237 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" event={"ID":"732989c5-1b89-46f0-9917-b68613f7f005","Type":"ContainerStarted","Data":"23db8a325d8ebadb5b1d1bab4b4a614762b90313ccb3adc271c971443cebddcd"} Mar 19 11:54:13.737681 master-0 kubenswrapper[6932]: I0319 11:54:13.737644 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" event={"ID":"39d3ac31-9259-454b-8e1c-e23024f8f2b2","Type":"ContainerStarted","Data":"3df2e0aeecd7a4f19adde17efa6cfd363a148088a31fc3ab4825b957d359b489"} Mar 19 11:54:13.746818 master-0 kubenswrapper[6932]: I0319 11:54:13.746778 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:54:13.748197 master-0 kubenswrapper[6932]: I0319 11:54:13.748169 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:54:13.751817 master-0 kubenswrapper[6932]: I0319 11:54:13.751760 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfxw7\" (UniqueName: \"kubernetes.io/projected/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-kube-api-access-cfxw7\") pod 
\"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:54:13.751817 master-0 kubenswrapper[6932]: I0319 11:54:13.751819 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:54:13.767691 master-0 kubenswrapper[6932]: I0319 11:54:13.767613 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:54:13.779056 master-0 kubenswrapper[6932]: I0319 11:54:13.779007 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfxw7\" (UniqueName: \"kubernetes.io/projected/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-kube-api-access-cfxw7\") pod \"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:54:13.886792 master-0 kubenswrapper[6932]: I0319 11:54:13.883898 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rrvxk"] Mar 19 11:54:13.886792 master-0 kubenswrapper[6932]: I0319 11:54:13.884854 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:13.893389 master-0 kubenswrapper[6932]: I0319 11:54:13.893300 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:54:13.893551 master-0 kubenswrapper[6932]: I0319 11:54:13.893486 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-6zbld" Mar 19 11:54:13.916972 master-0 kubenswrapper[6932]: I0319 11:54:13.916885 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrvxk"] Mar 19 11:54:14.055704 master-0 kubenswrapper[6932]: I0319 11:54:14.055626 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-utilities\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:14.056018 master-0 kubenswrapper[6932]: I0319 11:54:14.055722 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-catalog-content\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:14.056018 master-0 kubenswrapper[6932]: I0319 11:54:14.055773 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsdjb\" (UniqueName: \"kubernetes.io/projected/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-kube-api-access-lsdjb\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 19 
11:54:14.078554 master-0 kubenswrapper[6932]: I0319 11:54:14.078183 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pn57d"] Mar 19 11:54:14.079243 master-0 kubenswrapper[6932]: I0319 11:54:14.079180 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.103877 master-0 kubenswrapper[6932]: I0319 11:54:14.103778 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-6nq75" Mar 19 11:54:14.110510 master-0 kubenswrapper[6932]: I0319 11:54:14.109575 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn57d"] Mar 19 11:54:14.160434 master-0 kubenswrapper[6932]: I0319 11:54:14.157610 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-catalog-content\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:14.160434 master-0 kubenswrapper[6932]: I0319 11:54:14.157886 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsdjb\" (UniqueName: \"kubernetes.io/projected/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-kube-api-access-lsdjb\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:14.160434 master-0 kubenswrapper[6932]: I0319 11:54:14.157967 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-utilities\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 
19 11:54:14.160434 master-0 kubenswrapper[6932]: I0319 11:54:14.158510 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-utilities\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:14.160434 master-0 kubenswrapper[6932]: I0319 11:54:14.158820 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-catalog-content\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:14.174099 master-0 kubenswrapper[6932]: I0319 11:54:14.173855 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsdjb\" (UniqueName: \"kubernetes.io/projected/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-kube-api-access-lsdjb\") pod \"community-operators-rrvxk\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:14.258841 master-0 kubenswrapper[6932]: I0319 11:54:14.258790 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-catalog-content\") pod \"certified-operators-pn57d\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.259073 master-0 kubenswrapper[6932]: I0319 11:54:14.258857 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45kt\" (UniqueName: \"kubernetes.io/projected/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-kube-api-access-g45kt\") pod \"certified-operators-pn57d\" (UID: 
\"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.259073 master-0 kubenswrapper[6932]: I0319 11:54:14.258891 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-utilities\") pod \"certified-operators-pn57d\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.272752 master-0 kubenswrapper[6932]: I0319 11:54:14.272126 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:54:14.360569 master-0 kubenswrapper[6932]: I0319 11:54:14.360519 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-catalog-content\") pod \"certified-operators-pn57d\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.360885 master-0 kubenswrapper[6932]: I0319 11:54:14.360588 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45kt\" (UniqueName: \"kubernetes.io/projected/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-kube-api-access-g45kt\") pod \"certified-operators-pn57d\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.361051 master-0 kubenswrapper[6932]: I0319 11:54:14.360973 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-utilities\") pod \"certified-operators-pn57d\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.361109 master-0 kubenswrapper[6932]: 
I0319 11:54:14.361082 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-catalog-content\") pod \"certified-operators-pn57d\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.361397 master-0 kubenswrapper[6932]: I0319 11:54:14.361372 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-utilities\") pod \"certified-operators-pn57d\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:14.488114 master-0 kubenswrapper[6932]: I0319 11:54:14.488077 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_96498b3d-c93f-4b42-a0aa-2afec3450b1d/installer/0.log" Mar 19 11:54:14.488197 master-0 kubenswrapper[6932]: I0319 11:54:14.488145 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:14.666439 master-0 kubenswrapper[6932]: I0319 11:54:14.666356 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-var-lock\") pod \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " Mar 19 11:54:14.667130 master-0 kubenswrapper[6932]: I0319 11:54:14.666490 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-var-lock" (OuterVolumeSpecName: "var-lock") pod "96498b3d-c93f-4b42-a0aa-2afec3450b1d" (UID: "96498b3d-c93f-4b42-a0aa-2afec3450b1d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:14.667130 master-0 kubenswrapper[6932]: I0319 11:54:14.666539 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kube-api-access\") pod \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " Mar 19 11:54:14.667130 master-0 kubenswrapper[6932]: I0319 11:54:14.666653 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kubelet-dir\") pod \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\" (UID: \"96498b3d-c93f-4b42-a0aa-2afec3450b1d\") " Mar 19 11:54:14.667130 master-0 kubenswrapper[6932]: I0319 11:54:14.666772 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "96498b3d-c93f-4b42-a0aa-2afec3450b1d" (UID: "96498b3d-c93f-4b42-a0aa-2afec3450b1d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:14.668347 master-0 kubenswrapper[6932]: I0319 11:54:14.667311 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:14.668347 master-0 kubenswrapper[6932]: I0319 11:54:14.667364 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:14.674163 master-0 kubenswrapper[6932]: I0319 11:54:14.674111 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "96498b3d-c93f-4b42-a0aa-2afec3450b1d" (UID: "96498b3d-c93f-4b42-a0aa-2afec3450b1d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:14.743053 master-0 kubenswrapper[6932]: I0319 11:54:14.742894 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_96498b3d-c93f-4b42-a0aa-2afec3450b1d/installer/0.log" Mar 19 11:54:14.743053 master-0 kubenswrapper[6932]: I0319 11:54:14.742950 6932 generic.go:334] "Generic (PLEG): container finished" podID="96498b3d-c93f-4b42-a0aa-2afec3450b1d" containerID="6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc" exitCode=1 Mar 19 11:54:14.743053 master-0 kubenswrapper[6932]: I0319 11:54:14.743026 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:14.743053 master-0 kubenswrapper[6932]: I0319 11:54:14.743034 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"96498b3d-c93f-4b42-a0aa-2afec3450b1d","Type":"ContainerDied","Data":"6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc"} Mar 19 11:54:14.743874 master-0 kubenswrapper[6932]: I0319 11:54:14.743103 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"96498b3d-c93f-4b42-a0aa-2afec3450b1d","Type":"ContainerDied","Data":"b0eeab0f4b63d0b832bcb033f60d90bd7a9ab1aefa13cc2a83e1411234017f43"} Mar 19 11:54:14.743874 master-0 kubenswrapper[6932]: I0319 11:54:14.743128 6932 scope.go:117] "RemoveContainer" containerID="6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc" Mar 19 11:54:14.744469 master-0 kubenswrapper[6932]: I0319 11:54:14.744422 6932 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 11:54:14.744719 master-0 kubenswrapper[6932]: I0319 11:54:14.744677 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" containerID="cri-o://b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c" gracePeriod=30 Mar 19 11:54:14.744810 master-0 kubenswrapper[6932]: I0319 11:54:14.744688 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" containerID="cri-o://863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861" gracePeriod=30 Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: I0319 11:54:14.752384 6932 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 19 
11:54:14.760530 master-0 kubenswrapper[6932]: E0319 11:54:14.752670 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: I0319 11:54:14.752684 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: E0319 11:54:14.752705 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96498b3d-c93f-4b42-a0aa-2afec3450b1d" containerName="installer" Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: I0319 11:54:14.752936 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="96498b3d-c93f-4b42-a0aa-2afec3450b1d" containerName="installer" Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: E0319 11:54:14.752957 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: I0319 11:54:14.752967 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: I0319 11:54:14.753092 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: I0319 11:54:14.753109 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 11:54:14.760530 master-0 kubenswrapper[6932]: I0319 11:54:14.753120 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="96498b3d-c93f-4b42-a0aa-2afec3450b1d" containerName="installer" Mar 19 11:54:14.768415 master-0 kubenswrapper[6932]: I0319 11:54:14.761212 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.768415 master-0 kubenswrapper[6932]: I0319 11:54:14.763556 6932 scope.go:117] "RemoveContainer" containerID="6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc" Mar 19 11:54:14.769936 master-0 kubenswrapper[6932]: E0319 11:54:14.769867 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc\": container with ID starting with 6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc not found: ID does not exist" containerID="6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc" Mar 19 11:54:14.770101 master-0 kubenswrapper[6932]: I0319 11:54:14.769948 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc"} err="failed to get container status \"6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc\": rpc error: code = NotFound desc = could not find container \"6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc\": container with ID starting with 6190c5ce6a4577438f33206ca4dae98830aac87d35d2a1c7d5b529f64a571efc not found: ID does not exist" Mar 19 11:54:14.783949 master-0 kubenswrapper[6932]: I0319 11:54:14.771824 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96498b3d-c93f-4b42-a0aa-2afec3450b1d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:14.873379 master-0 kubenswrapper[6932]: I0319 11:54:14.873307 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 
19 11:54:14.873379 master-0 kubenswrapper[6932]: I0319 11:54:14.873371 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.873379 master-0 kubenswrapper[6932]: I0319 11:54:14.873391 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.873675 master-0 kubenswrapper[6932]: I0319 11:54:14.873411 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.873675 master-0 kubenswrapper[6932]: I0319 11:54:14.873433 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.873675 master-0 kubenswrapper[6932]: I0319 11:54:14.873473 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974642 master-0 kubenswrapper[6932]: I0319 11:54:14.974554 6932 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974661 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974719 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974760 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974787 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974818 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" 
(UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974846 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974907 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974936 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.974966 master-0 kubenswrapper[6932]: I0319 11:54:14.974964 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.975213 master-0 kubenswrapper[6932]: I0319 11:54:14.974992 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:14.975213 master-0 kubenswrapper[6932]: I0319 11:54:14.975018 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:54:15.033874 master-0 kubenswrapper[6932]: I0319 11:54:15.030410 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45kt\" (UniqueName: \"kubernetes.io/projected/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-kube-api-access-g45kt\") pod \"certified-operators-pn57d\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") " pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:15.164650 master-0 kubenswrapper[6932]: I0319 11:54:15.164575 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9"] Mar 19 11:54:15.173112 master-0 kubenswrapper[6932]: W0319 11:54:15.173063 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a51eeaf_1349_4bf3_932d_22ed5ce7c161.slice/crio-738ee83a26962b58779f847316e3a8a5be1d6bd92f0c0f29c25cdbb8703c5c59 WatchSource:0}: Error finding container 738ee83a26962b58779f847316e3a8a5be1d6bd92f0c0f29c25cdbb8703c5c59: Status 404 returned error can't find the container with id 738ee83a26962b58779f847316e3a8a5be1d6bd92f0c0f29c25cdbb8703c5c59 Mar 19 11:54:15.323984 master-0 kubenswrapper[6932]: I0319 11:54:15.323106 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:54:15.749630 master-0 kubenswrapper[6932]: I0319 11:54:15.749539 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" event={"ID":"7a51eeaf-1349-4bf3-932d-22ed5ce7c161","Type":"ContainerStarted","Data":"738ee83a26962b58779f847316e3a8a5be1d6bd92f0c0f29c25cdbb8703c5c59"} Mar 19 11:54:17.760881 master-0 kubenswrapper[6932]: I0319 11:54:17.760813 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" event={"ID":"7a51eeaf-1349-4bf3-932d-22ed5ce7c161","Type":"ContainerStarted","Data":"fba15ac5fd8638fa2d8fe5188431cf574d56ecf14fb3b1611a5b61dc6246db85"} Mar 19 11:54:27.813481 master-0 kubenswrapper[6932]: E0319 11:54:27.813422 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 11:54:27.814025 master-0 kubenswrapper[6932]: I0319 11:54:27.813895 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 11:54:27.828810 master-0 kubenswrapper[6932]: W0319 11:54:27.828765 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b4ed170d527099878cb5fdd508a2fb.slice/crio-9a2616ea0257b4942755b9e9fb23bb4dfd3518f40e9ffe96a9ef4230caaa00fe WatchSource:0}: Error finding container 9a2616ea0257b4942755b9e9fb23bb4dfd3518f40e9ffe96a9ef4230caaa00fe: Status 404 returned error can't find the container with id 9a2616ea0257b4942755b9e9fb23bb4dfd3518f40e9ffe96a9ef4230caaa00fe Mar 19 11:54:28.815913 master-0 kubenswrapper[6932]: I0319 11:54:28.815882 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_c385dd73-4a25-4827-9c8f-d923afc782b7/installer/0.log" Mar 19 11:54:28.816300 master-0 kubenswrapper[6932]: I0319 11:54:28.815930 6932 generic.go:334] "Generic (PLEG): container finished" podID="c385dd73-4a25-4827-9c8f-d923afc782b7" containerID="168c2214cdbfaaf9c363e282042065f817a6654cada35b4a09dd8914621dd3a3" exitCode=1 Mar 19 11:54:28.816300 master-0 kubenswrapper[6932]: I0319 11:54:28.815982 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"c385dd73-4a25-4827-9c8f-d923afc782b7","Type":"ContainerDied","Data":"168c2214cdbfaaf9c363e282042065f817a6654cada35b4a09dd8914621dd3a3"} Mar 19 11:54:28.817986 master-0 kubenswrapper[6932]: I0319 11:54:28.817894 6932 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64" exitCode=0 Mar 19 11:54:28.818063 master-0 kubenswrapper[6932]: I0319 11:54:28.817972 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64"} Mar 19 11:54:28.818063 master-0 kubenswrapper[6932]: I0319 11:54:28.818018 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"9a2616ea0257b4942755b9e9fb23bb4dfd3518f40e9ffe96a9ef4230caaa00fe"} Mar 19 11:54:28.820239 master-0 kubenswrapper[6932]: I0319 11:54:28.820203 6932 generic.go:334] "Generic (PLEG): container finished" podID="6bde080b-3820-463f-a27d-9fb9a7843d5d" containerID="bdd2ba95a96b40f792db569b1a38d500c6161c9b6b35b6b22d8099e9a3a35339" exitCode=0 Mar 19 11:54:28.820239 master-0 kubenswrapper[6932]: I0319 11:54:28.820239 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"6bde080b-3820-463f-a27d-9fb9a7843d5d","Type":"ContainerDied","Data":"bdd2ba95a96b40f792db569b1a38d500c6161c9b6b35b6b22d8099e9a3a35339"} Mar 19 11:54:29.089576 master-0 kubenswrapper[6932]: I0319 11:54:29.089538 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_c385dd73-4a25-4827-9c8f-d923afc782b7/installer/0.log" Mar 19 11:54:29.089749 master-0 kubenswrapper[6932]: I0319 11:54:29.089618 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:29.268774 master-0 kubenswrapper[6932]: I0319 11:54:29.268696 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c385dd73-4a25-4827-9c8f-d923afc782b7-kube-api-access\") pod \"c385dd73-4a25-4827-9c8f-d923afc782b7\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " Mar 19 11:54:29.269068 master-0 kubenswrapper[6932]: I0319 11:54:29.268838 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-kubelet-dir\") pod \"c385dd73-4a25-4827-9c8f-d923afc782b7\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " Mar 19 11:54:29.269068 master-0 kubenswrapper[6932]: I0319 11:54:29.268880 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-var-lock\") pod \"c385dd73-4a25-4827-9c8f-d923afc782b7\" (UID: \"c385dd73-4a25-4827-9c8f-d923afc782b7\") " Mar 19 11:54:29.269068 master-0 kubenswrapper[6932]: I0319 11:54:29.268952 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c385dd73-4a25-4827-9c8f-d923afc782b7" (UID: "c385dd73-4a25-4827-9c8f-d923afc782b7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:29.269068 master-0 kubenswrapper[6932]: I0319 11:54:29.269058 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-var-lock" (OuterVolumeSpecName: "var-lock") pod "c385dd73-4a25-4827-9c8f-d923afc782b7" (UID: "c385dd73-4a25-4827-9c8f-d923afc782b7"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:29.269254 master-0 kubenswrapper[6932]: I0319 11:54:29.269085 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:29.271507 master-0 kubenswrapper[6932]: I0319 11:54:29.271459 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c385dd73-4a25-4827-9c8f-d923afc782b7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c385dd73-4a25-4827-9c8f-d923afc782b7" (UID: "c385dd73-4a25-4827-9c8f-d923afc782b7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:29.370936 master-0 kubenswrapper[6932]: I0319 11:54:29.370847 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c385dd73-4a25-4827-9c8f-d923afc782b7-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:29.370936 master-0 kubenswrapper[6932]: I0319 11:54:29.370919 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c385dd73-4a25-4827-9c8f-d923afc782b7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:29.827000 master-0 kubenswrapper[6932]: I0319 11:54:29.826903 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_c385dd73-4a25-4827-9c8f-d923afc782b7/installer/0.log" Mar 19 11:54:29.827816 master-0 kubenswrapper[6932]: I0319 11:54:29.827009 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"c385dd73-4a25-4827-9c8f-d923afc782b7","Type":"ContainerDied","Data":"3cb65febed88a73558fad96ec47a3c975afac21bc24c0f69a6eaebb5e72b8a31"} Mar 19 11:54:29.827816 master-0 kubenswrapper[6932]: I0319 
11:54:29.827329 6932 scope.go:117] "RemoveContainer" containerID="168c2214cdbfaaf9c363e282042065f817a6654cada35b4a09dd8914621dd3a3" Mar 19 11:54:29.827816 master-0 kubenswrapper[6932]: I0319 11:54:29.827071 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:29.829293 master-0 kubenswrapper[6932]: I0319 11:54:29.829250 6932 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6" exitCode=1 Mar 19 11:54:29.829439 master-0 kubenswrapper[6932]: I0319 11:54:29.829331 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6"} Mar 19 11:54:29.830181 master-0 kubenswrapper[6932]: I0319 11:54:29.830064 6932 scope.go:117] "RemoveContainer" containerID="58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6" Mar 19 11:54:29.907006 master-0 kubenswrapper[6932]: E0319 11:54:29.906944 6932 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:54:30.131029 master-0 kubenswrapper[6932]: I0319 11:54:30.130972 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:30.282156 master-0 kubenswrapper[6932]: I0319 11:54:30.282091 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-kubelet-dir\") pod \"6bde080b-3820-463f-a27d-9fb9a7843d5d\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " Mar 19 11:54:30.282560 master-0 kubenswrapper[6932]: I0319 11:54:30.282255 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bde080b-3820-463f-a27d-9fb9a7843d5d-kube-api-access\") pod \"6bde080b-3820-463f-a27d-9fb9a7843d5d\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " Mar 19 11:54:30.282644 master-0 kubenswrapper[6932]: I0319 11:54:30.282608 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-var-lock\") pod \"6bde080b-3820-463f-a27d-9fb9a7843d5d\" (UID: \"6bde080b-3820-463f-a27d-9fb9a7843d5d\") " Mar 19 11:54:30.282772 master-0 kubenswrapper[6932]: I0319 11:54:30.282694 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-var-lock" (OuterVolumeSpecName: "var-lock") pod "6bde080b-3820-463f-a27d-9fb9a7843d5d" (UID: "6bde080b-3820-463f-a27d-9fb9a7843d5d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:30.283123 master-0 kubenswrapper[6932]: I0319 11:54:30.283096 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:30.283183 master-0 kubenswrapper[6932]: I0319 11:54:30.283166 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6bde080b-3820-463f-a27d-9fb9a7843d5d" (UID: "6bde080b-3820-463f-a27d-9fb9a7843d5d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:30.285604 master-0 kubenswrapper[6932]: I0319 11:54:30.285564 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bde080b-3820-463f-a27d-9fb9a7843d5d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6bde080b-3820-463f-a27d-9fb9a7843d5d" (UID: "6bde080b-3820-463f-a27d-9fb9a7843d5d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:30.383977 master-0 kubenswrapper[6932]: I0319 11:54:30.383794 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bde080b-3820-463f-a27d-9fb9a7843d5d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:30.383977 master-0 kubenswrapper[6932]: I0319 11:54:30.383875 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bde080b-3820-463f-a27d-9fb9a7843d5d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:30.838719 master-0 kubenswrapper[6932]: I0319 11:54:30.838658 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"6bde080b-3820-463f-a27d-9fb9a7843d5d","Type":"ContainerDied","Data":"89d6b9652bfd68fb0b68a832373fa141222adae111524f0fd223064e1824cd6a"} Mar 19 11:54:30.838719 master-0 kubenswrapper[6932]: I0319 11:54:30.838722 6932 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89d6b9652bfd68fb0b68a832373fa141222adae111524f0fd223064e1824cd6a" Mar 19 11:54:30.838719 master-0 kubenswrapper[6932]: I0319 11:54:30.838677 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:30.840964 master-0 kubenswrapper[6932]: I0319 11:54:30.840917 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c"} Mar 19 11:54:30.842613 master-0 kubenswrapper[6932]: I0319 11:54:30.842579 6932 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="901ed10fec5e9417fcd7522a27f15f9a949e9c0dd2ab8e429fd9b30afd0247bf" exitCode=1 Mar 19 11:54:30.842675 master-0 kubenswrapper[6932]: I0319 11:54:30.842629 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"901ed10fec5e9417fcd7522a27f15f9a949e9c0dd2ab8e429fd9b30afd0247bf"} Mar 19 11:54:30.842956 master-0 kubenswrapper[6932]: I0319 11:54:30.842925 6932 scope.go:117] "RemoveContainer" containerID="901ed10fec5e9417fcd7522a27f15f9a949e9c0dd2ab8e429fd9b30afd0247bf" Mar 19 11:54:31.774155 master-0 kubenswrapper[6932]: E0319 11:54:31.773988 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:54:21Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:54:21Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:54:21Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:54:21Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea9
03f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":468265024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:759fb1d5353dbbadd443f38631d977ca3aed9787b873be05cc9660532a252739\\\"],\\\"sizeBytes\\\":448828620},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\
"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483\\\"],\\\"sizeBytes\\\":438654374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3a494212f1ba17f0f0980eef583218330eccb56eadf6b8cb0548c76d99b5014\\\"],\\\"sizeBytes\\\":407347125},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422\\\"],\\\"sizeBytes\\\":396521761}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:54:32.148787 master-0 kubenswrapper[6932]: I0319 11:54:32.148581 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825"} Mar 19 11:54:37.170384 master-0 kubenswrapper[6932]: I0319 11:54:37.170304 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-6ghdm_e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/openshift-controller-manager-operator/0.log" Mar 19 11:54:37.170384 master-0 kubenswrapper[6932]: I0319 11:54:37.170361 6932 generic.go:334] "Generic (PLEG): container finished" podID="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" containerID="3ab6a68db657d0e7924cc47a81bc9831d8055a58f93210e34c6ef5c5b5597505" exitCode=1 
Mar 19 11:54:37.170384 master-0 kubenswrapper[6932]: I0319 11:54:37.170394 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" event={"ID":"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf","Type":"ContainerDied","Data":"3ab6a68db657d0e7924cc47a81bc9831d8055a58f93210e34c6ef5c5b5597505"}
Mar 19 11:54:37.171262 master-0 kubenswrapper[6932]: I0319 11:54:37.170919 6932 scope.go:117] "RemoveContainer" containerID="3ab6a68db657d0e7924cc47a81bc9831d8055a58f93210e34c6ef5c5b5597505"
Mar 19 11:54:38.152526 master-0 kubenswrapper[6932]: I0319 11:54:38.152452 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:54:38.179351 master-0 kubenswrapper[6932]: I0319 11:54:38.179304 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-6ghdm_e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/openshift-controller-manager-operator/0.log"
Mar 19 11:54:38.180046 master-0 kubenswrapper[6932]: I0319 11:54:38.179371 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" event={"ID":"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf","Type":"ContainerStarted","Data":"fd8c32d22caf0bf1b1f569479b7d959cb1e7f7190abe63f16601f2e5b50a0711"}
Mar 19 11:54:38.542993 master-0 kubenswrapper[6932]: I0319 11:54:38.542862 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:38.542993 master-0 kubenswrapper[6932]: I0319 11:54:38.542930 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:38.542993 master-0 kubenswrapper[6932]: I0319 11:54:38.542877 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:38.543166 master-0 kubenswrapper[6932]: I0319 11:54:38.543022 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:38.852573 master-0 kubenswrapper[6932]: I0319 11:54:38.852230 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:54:39.185663 master-0 kubenswrapper[6932]: I0319 11:54:39.185593 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_8e508a43-99db-49eb-bf4e-e3e6a0f49761/installer/0.log"
Mar 19 11:54:39.185663 master-0 kubenswrapper[6932]: I0319 11:54:39.185643 6932 generic.go:334] "Generic (PLEG): container finished" podID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" containerID="300261e39c3fe1898b1aa4629252d5e05f336f7f74bdf1250eea81121a460d42" exitCode=1
Mar 19 11:54:39.186286 master-0 kubenswrapper[6932]: I0319 11:54:39.185673 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"8e508a43-99db-49eb-bf4e-e3e6a0f49761","Type":"ContainerDied","Data":"300261e39c3fe1898b1aa4629252d5e05f336f7f74bdf1250eea81121a460d42"}
Mar 19 11:54:39.908024 master-0 kubenswrapper[6932]: E0319 11:54:39.907973 6932 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io master-0)"
Mar 19 11:54:40.429089 master-0 kubenswrapper[6932]: I0319 11:54:40.429035 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_8e508a43-99db-49eb-bf4e-e3e6a0f49761/installer/0.log"
Mar 19 11:54:40.429418 master-0 kubenswrapper[6932]: I0319 11:54:40.429150 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 19 11:54:40.532142 master-0 kubenswrapper[6932]: I0319 11:54:40.531969 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-var-lock\") pod \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") "
Mar 19 11:54:40.532142 master-0 kubenswrapper[6932]: I0319 11:54:40.532071 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kube-api-access\") pod \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") "
Mar 19 11:54:40.532403 master-0 kubenswrapper[6932]: I0319 11:54:40.532155 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kubelet-dir\") pod \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\" (UID: \"8e508a43-99db-49eb-bf4e-e3e6a0f49761\") "
Mar 19 11:54:40.532403 master-0 kubenswrapper[6932]: I0319 11:54:40.532173 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-var-lock" (OuterVolumeSpecName: "var-lock") pod "8e508a43-99db-49eb-bf4e-e3e6a0f49761" (UID: "8e508a43-99db-49eb-bf4e-e3e6a0f49761"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:54:40.532403 master-0 kubenswrapper[6932]: I0319 11:54:40.532289 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8e508a43-99db-49eb-bf4e-e3e6a0f49761" (UID: "8e508a43-99db-49eb-bf4e-e3e6a0f49761"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:54:40.532639 master-0 kubenswrapper[6932]: I0319 11:54:40.532582 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:40.532639 master-0 kubenswrapper[6932]: I0319 11:54:40.532627 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:40.535186 master-0 kubenswrapper[6932]: I0319 11:54:40.535134 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8e508a43-99db-49eb-bf4e-e3e6a0f49761" (UID: "8e508a43-99db-49eb-bf4e-e3e6a0f49761"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:54:40.634195 master-0 kubenswrapper[6932]: I0319 11:54:40.634124 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e508a43-99db-49eb-bf4e-e3e6a0f49761-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:41.197478 master-0 kubenswrapper[6932]: I0319 11:54:41.197426 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_8e508a43-99db-49eb-bf4e-e3e6a0f49761/installer/0.log"
Mar 19 11:54:41.197806 master-0 kubenswrapper[6932]: I0319 11:54:41.197564 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"8e508a43-99db-49eb-bf4e-e3e6a0f49761","Type":"ContainerDied","Data":"9f8ce6740e029884a042c5c80751b9fc216004e3a3d167dbdc2e5cb2a86f8183"}
Mar 19 11:54:41.197806 master-0 kubenswrapper[6932]: I0319 11:54:41.197582 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 19 11:54:41.197806 master-0 kubenswrapper[6932]: I0319 11:54:41.197603 6932 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f8ce6740e029884a042c5c80751b9fc216004e3a3d167dbdc2e5cb2a86f8183"
Mar 19 11:54:41.543371 master-0 kubenswrapper[6932]: I0319 11:54:41.543163 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:41.543371 master-0 kubenswrapper[6932]: I0319 11:54:41.543246 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:41.543371 master-0 kubenswrapper[6932]: I0319 11:54:41.543263 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:41.544279 master-0 kubenswrapper[6932]: I0319 11:54:41.543381 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:41.774634 master-0 kubenswrapper[6932]: E0319 11:54:41.774552 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:54:41.824051 master-0 kubenswrapper[6932]: E0319 11:54:41.823941 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 19 11:54:41.853400 master-0 kubenswrapper[6932]: I0319 11:54:41.853322 6932 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:54:43.211136 master-0 kubenswrapper[6932]: I0319 11:54:43.211057 6932 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861" exitCode=0
Mar 19 11:54:43.213067 master-0 kubenswrapper[6932]: I0319 11:54:43.213036 6932 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0" exitCode=0
Mar 19 11:54:43.213143 master-0 kubenswrapper[6932]: I0319 11:54:43.213065 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0"}
Mar 19 11:54:44.543319 master-0 kubenswrapper[6932]: I0319 11:54:44.543240 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:44.543911 master-0 kubenswrapper[6932]: I0319 11:54:44.543303 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:44.543911 master-0 kubenswrapper[6932]: I0319 11:54:44.543328 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:44.543911 master-0 kubenswrapper[6932]: I0319 11:54:44.543401 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:44.543911 master-0 kubenswrapper[6932]: I0319 11:54:44.543550 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:54:44.543911 master-0 kubenswrapper[6932]: I0319 11:54:44.543608 6932 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:54:44.544160 master-0 kubenswrapper[6932]: I0319 11:54:44.544117 6932 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6"} pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Mar 19 11:54:44.544208 master-0 kubenswrapper[6932]: I0319 11:54:44.544176 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" containerID="cri-o://c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6" gracePeriod=30
Mar 19 11:54:44.544240 master-0 kubenswrapper[6932]: I0319 11:54:44.544207 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:44.544276 master-0 kubenswrapper[6932]: I0319 11:54:44.544236 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:44.845405 master-0 kubenswrapper[6932]: I0319 11:54:44.845359 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log"
Mar 19 11:54:44.845557 master-0 kubenswrapper[6932]: I0319 11:54:44.845460 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 11:54:44.990349 master-0 kubenswrapper[6932]: I0319 11:54:44.990256 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") "
Mar 19 11:54:44.990349 master-0 kubenswrapper[6932]: I0319 11:54:44.990346 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") "
Mar 19 11:54:44.990695 master-0 kubenswrapper[6932]: I0319 11:54:44.990427 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs" (OuterVolumeSpecName: "certs") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:54:44.990695 master-0 kubenswrapper[6932]: I0319 11:54:44.990557 6932 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:44.990695 master-0 kubenswrapper[6932]: I0319 11:54:44.990569 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir" (OuterVolumeSpecName: "data-dir") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:54:45.092248 master-0 kubenswrapper[6932]: I0319 11:54:45.092080 6932 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:45.228111 master-0 kubenswrapper[6932]: I0319 11:54:45.228082 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log"
Mar 19 11:54:45.228393 master-0 kubenswrapper[6932]: I0319 11:54:45.228372 6932 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c" exitCode=137
Mar 19 11:54:45.228484 master-0 kubenswrapper[6932]: I0319 11:54:45.228473 6932 scope.go:117] "RemoveContainer" containerID="863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861"
Mar 19 11:54:45.228567 master-0 kubenswrapper[6932]: I0319 11:54:45.228521 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 11:54:45.241934 master-0 kubenswrapper[6932]: I0319 11:54:45.241876 6932 scope.go:117] "RemoveContainer" containerID="b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c"
Mar 19 11:54:45.257877 master-0 kubenswrapper[6932]: I0319 11:54:45.257793 6932 scope.go:117] "RemoveContainer" containerID="863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861"
Mar 19 11:54:45.258398 master-0 kubenswrapper[6932]: E0319 11:54:45.258344 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861\": container with ID starting with 863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861 not found: ID does not exist" containerID="863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861"
Mar 19 11:54:45.258398 master-0 kubenswrapper[6932]: I0319 11:54:45.258384 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861"} err="failed to get container status \"863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861\": rpc error: code = NotFound desc = could not find container \"863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861\": container with ID starting with 863c1805db648a1c68dab43606fe7bf357e1d14504af2989916cb369fe922861 not found: ID does not exist"
Mar 19 11:54:45.258583 master-0 kubenswrapper[6932]: I0319 11:54:45.258409 6932 scope.go:117] "RemoveContainer" containerID="b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c"
Mar 19 11:54:45.258814 master-0 kubenswrapper[6932]: E0319 11:54:45.258747 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c\": container with ID starting with b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c not found: ID does not exist" containerID="b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c"
Mar 19 11:54:45.258893 master-0 kubenswrapper[6932]: I0319 11:54:45.258801 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c"} err="failed to get container status \"b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c\": rpc error: code = NotFound desc = could not find container \"b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c\": container with ID starting with b9695d95c55ced36d31ee4b3802610d675e3206471662e3165ad086a92a3332c not found: ID does not exist"
Mar 19 11:54:45.879587 master-0 kubenswrapper[6932]: I0319 11:54:45.879523 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d664a6d0d2a24360dee10612610f1b59" path="/var/lib/kubelet/pods/d664a6d0d2a24360dee10612610f1b59/volumes"
Mar 19 11:54:45.880580 master-0 kubenswrapper[6932]: I0319 11:54:45.880063 6932 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 19 11:54:47.543765 master-0 kubenswrapper[6932]: I0319 11:54:47.543681 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:47.544416 master-0 kubenswrapper[6932]: I0319 11:54:47.543811 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:49.164120 master-0 kubenswrapper[6932]: E0319 11:54:49.163869 6932 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bfb889256a2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:54:14.744676002 +0000 UTC m=+79.103736234,LastTimestamp:2026-03-19 11:54:14.744676002 +0000 UTC m=+79.103736234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:54:49.397804 master-0 kubenswrapper[6932]: E0319 11:54:49.397741 6932 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 19 11:54:49.397804 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-rrvxk_openshift-marketplace_64f5cbf1-f761-4531-8e5c-1f9b318b0cb9_0(db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115): error adding pod openshift-marketplace_community-operators-rrvxk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115" Netns:"/var/run/netns/1fa4a10d-bd1f-4141-a777-b2cdb93f7034" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-rrvxk;K8S_POD_INFRA_CONTAINER_ID=db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115;K8S_POD_UID=64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-rrvxk] networking: Multus: [openshift-marketplace/community-operators-rrvxk/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-rrvxk in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-rrvxk in out of cluster comm: status update failed for pod openshift-marketplace/community-operators-rrvxk: Timeout: request did not complete within requested timeout - context deadline exceeded
Mar 19 11:54:49.397804 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 19 11:54:49.397804 master-0 kubenswrapper[6932]: >
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: E0319 11:54:49.397831 6932 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-rrvxk_openshift-marketplace_64f5cbf1-f761-4531-8e5c-1f9b318b0cb9_0(db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115): error adding pod openshift-marketplace_community-operators-rrvxk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115" Netns:"/var/run/netns/1fa4a10d-bd1f-4141-a777-b2cdb93f7034" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-rrvxk;K8S_POD_INFRA_CONTAINER_ID=db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115;K8S_POD_UID=64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-rrvxk] networking: Multus: [openshift-marketplace/community-operators-rrvxk/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-rrvxk in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-rrvxk in out of cluster comm: status update failed for pod openshift-marketplace/community-operators-rrvxk: Timeout: request did not complete within requested timeout - context deadline exceeded
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: > pod="openshift-marketplace/community-operators-rrvxk"
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: E0319 11:54:49.397855 6932 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-rrvxk_openshift-marketplace_64f5cbf1-f761-4531-8e5c-1f9b318b0cb9_0(db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115): error adding pod openshift-marketplace_community-operators-rrvxk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115" Netns:"/var/run/netns/1fa4a10d-bd1f-4141-a777-b2cdb93f7034" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-rrvxk;K8S_POD_INFRA_CONTAINER_ID=db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115;K8S_POD_UID=64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-rrvxk] networking: Multus: [openshift-marketplace/community-operators-rrvxk/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-rrvxk in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-rrvxk in out of cluster comm: status update failed for pod openshift-marketplace/community-operators-rrvxk: Timeout: request did not complete within requested timeout - context deadline exceeded
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: > pod="openshift-marketplace/community-operators-rrvxk"
Mar 19 11:54:49.397979 master-0 kubenswrapper[6932]: E0319 11:54:49.397918 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-rrvxk_openshift-marketplace(64f5cbf1-f761-4531-8e5c-1f9b318b0cb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-rrvxk_openshift-marketplace(64f5cbf1-f761-4531-8e5c-1f9b318b0cb9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-rrvxk_openshift-marketplace_64f5cbf1-f761-4531-8e5c-1f9b318b0cb9_0(db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115): error adding pod openshift-marketplace_community-operators-rrvxk to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115\\\" Netns:\\\"/var/run/netns/1fa4a10d-bd1f-4141-a777-b2cdb93f7034\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-rrvxk;K8S_POD_INFRA_CONTAINER_ID=db04621d350e29119d61db6c762956e7b0098a8bfc1d8036948fd67dc08ef115;K8S_POD_UID=64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/community-operators-rrvxk] networking: Multus: [openshift-marketplace/community-operators-rrvxk/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-rrvxk in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-rrvxk in out of cluster comm: status update failed for pod openshift-marketplace/community-operators-rrvxk: Timeout: request did not complete within requested timeout - context deadline exceeded\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/community-operators-rrvxk" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9"
Mar 19 11:54:49.909190 master-0 kubenswrapper[6932]: E0319 11:54:49.909117 6932 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:54:50.261377 master-0 kubenswrapper[6932]: I0319 11:54:50.261335 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrvxk"
Mar 19 11:54:50.261916 master-0 kubenswrapper[6932]: I0319 11:54:50.261712 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrvxk"
Mar 19 11:54:50.543378 master-0 kubenswrapper[6932]: I0319 11:54:50.543250 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:50.543378 master-0 kubenswrapper[6932]: I0319 11:54:50.543315 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:51.775319 master-0 kubenswrapper[6932]: E0319 11:54:51.775226 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:54:51.853180 master-0 kubenswrapper[6932]: I0319 11:54:51.853065 6932 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:54:53.542778 master-0 kubenswrapper[6932]: I0319 11:54:53.542722 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:53.543653 master-0 kubenswrapper[6932]: I0319 11:54:53.543624 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:56.301401 master-0 kubenswrapper[6932]: I0319 11:54:56.301284 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-7fz6w_3c3b0d24-ce5e-49c3-a546-874356f75dc6/network-operator/0.log"
Mar 19 11:54:56.301401 master-0 kubenswrapper[6932]: I0319 11:54:56.301363 6932 generic.go:334] "Generic (PLEG): container finished" podID="3c3b0d24-ce5e-49c3-a546-874356f75dc6" containerID="a35a4f30770261f78e16c8cbde80e6ad1d01d59985d717446c5cf700c3ca0a3e" exitCode=255
Mar 19 11:54:56.543775 master-0 kubenswrapper[6932]: I0319 11:54:56.543644 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:54:56.543775 master-0 kubenswrapper[6932]: I0319 11:54:56.543768 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:54:57.224275 master-0 kubenswrapper[6932]: E0319 11:54:57.224066 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin
\"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 11:54:57.417007 master-0 kubenswrapper[6932]: I0319 11:54:57.416943 6932 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-9w7hc container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Mar 19 11:54:57.417007 master-0 kubenswrapper[6932]: I0319 11:54:57.417000 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" podUID="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Mar 19 11:54:58.313144 master-0 kubenswrapper[6932]: I0319 11:54:58.313065 6932 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3" exitCode=0 Mar 19 11:54:59.543152 master-0 kubenswrapper[6932]: I0319 11:54:59.543076 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:54:59.543152 master-0 kubenswrapper[6932]: I0319 11:54:59.543147 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:54:59.911123 master-0 kubenswrapper[6932]: E0319 11:54:59.910951 6932 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:01.330546 master-0 kubenswrapper[6932]: I0319 11:55:01.330467 6932 generic.go:334] "Generic (PLEG): container finished" podID="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" containerID="b1921d5234eb4af4d7731c20be87a9595434841b33d272f8f2c3ade584fe4c62" exitCode=0 Mar 19 11:55:01.776014 master-0 kubenswrapper[6932]: E0319 11:55:01.775912 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:01.853006 master-0 kubenswrapper[6932]: I0319 11:55:01.852831 6932 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:02.340498 master-0 kubenswrapper[6932]: I0319 11:55:02.340439 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-j528w_8438d015-106b-4aed-ae12-dda781ce51fc/approver/0.log" Mar 19 11:55:02.341153 master-0 kubenswrapper[6932]: I0319 11:55:02.341034 6932 generic.go:334] "Generic (PLEG): container finished" podID="8438d015-106b-4aed-ae12-dda781ce51fc" containerID="27aeacdf42166ebdfe7943145673659894eb1a05c94251adf45a06c9d05c04a8" exitCode=1 Mar 19 11:55:02.543615 master-0 kubenswrapper[6932]: I0319 11:55:02.543569 6932 patch_prober.go:28] interesting 
pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:02.543998 master-0 kubenswrapper[6932]: I0319 11:55:02.543964 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:05.543215 master-0 kubenswrapper[6932]: I0319 11:55:05.543120 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:05.543215 master-0 kubenswrapper[6932]: I0319 11:55:05.543194 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:07.365981 master-0 kubenswrapper[6932]: I0319 11:55:07.365908 6932 generic.go:334] "Generic (PLEG): container finished" podID="66f88242-8b0b-4790-bbb6-445c19b34ee7" containerID="f48ebfe02dc1f93683f1d2eea873f5d0c2c3081e3483e2d09faebd411fa396ef" exitCode=0 Mar 19 11:55:08.543359 master-0 kubenswrapper[6932]: I0319 11:55:08.543229 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: 
Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:08.543359 master-0 kubenswrapper[6932]: I0319 11:55:08.543325 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:09.911562 master-0 kubenswrapper[6932]: E0319 11:55:09.911473 6932 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:09.911562 master-0 kubenswrapper[6932]: I0319 11:55:09.911549 6932 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 11:55:10.381609 master-0 kubenswrapper[6932]: I0319 11:55:10.381562 6932 generic.go:334] "Generic (PLEG): container finished" podID="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" containerID="a00e4976297d868e9d1a74ee69351e1ac6225f1b3fff400804a95076bf8deddd" exitCode=0 Mar 19 11:55:11.543888 master-0 kubenswrapper[6932]: I0319 11:55:11.543806 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:11.544516 master-0 kubenswrapper[6932]: I0319 11:55:11.543907 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" 
podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:11.776462 master-0 kubenswrapper[6932]: E0319 11:55:11.776376 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:11.776462 master-0 kubenswrapper[6932]: E0319 11:55:11.776419 6932 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 11:55:12.393328 master-0 kubenswrapper[6932]: I0319 11:55:12.393155 6932 generic.go:334] "Generic (PLEG): container finished" podID="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" containerID="ffd01994498e412e963b01ac06f0e6ad28082a18471897dde077305cc7888366" exitCode=0 Mar 19 11:55:12.395402 master-0 kubenswrapper[6932]: I0319 11:55:12.395359 6932 generic.go:334] "Generic (PLEG): container finished" podID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerID="c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6" exitCode=0 Mar 19 11:55:15.169490 master-0 kubenswrapper[6932]: I0319 11:55:15.169395 6932 status_manager.go:851] "Failed to get status for pod" podUID="96498b3d-c93f-4b42-a0aa-2afec3450b1d" pod="openshift-kube-scheduler/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)" Mar 19 11:55:15.410647 master-0 kubenswrapper[6932]: I0319 11:55:15.410501 6932 generic.go:334] "Generic (PLEG): container finished" podID="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" containerID="a63fe33504bcc71f9b4e0c9d251065dc432b3176905c1514b755fad213c3ed25" exitCode=0 Mar 19 11:55:15.972102 master-0 kubenswrapper[6932]: E0319 11:55:15.972049 
6932 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 11:55:15.972102 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-pn57d_openshift-marketplace_ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c_0(de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe): error adding pod openshift-marketplace_certified-operators-pn57d to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe" Netns:"/var/run/netns/f7ff9b64-9543-4510-9b68-cd1845e2c68c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-pn57d;K8S_POD_INFRA_CONTAINER_ID=de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe;K8S_POD_UID=ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-pn57d] networking: Multus: [openshift-marketplace/certified-operators-pn57d/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-pn57d in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-pn57d in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pn57d?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:55:15.972102 master-0 kubenswrapper[6932]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:55:15.972102 master-0 kubenswrapper[6932]: > Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: E0319 11:55:15.972126 6932 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-pn57d_openshift-marketplace_ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c_0(de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe): error adding pod openshift-marketplace_certified-operators-pn57d to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe" Netns:"/var/run/netns/f7ff9b64-9543-4510-9b68-cd1845e2c68c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-pn57d;K8S_POD_INFRA_CONTAINER_ID=de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe;K8S_POD_UID=ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-pn57d] networking: Multus: [openshift-marketplace/certified-operators-pn57d/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-pn57d in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-pn57d in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pn57d?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: > pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: E0319 11:55:15.972151 6932 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-pn57d_openshift-marketplace_ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c_0(de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe): error adding pod openshift-marketplace_certified-operators-pn57d to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe" Netns:"/var/run/netns/f7ff9b64-9543-4510-9b68-cd1845e2c68c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-pn57d;K8S_POD_INFRA_CONTAINER_ID=de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe;K8S_POD_UID=ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-pn57d] networking: Multus: [openshift-marketplace/certified-operators-pn57d/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c]: error 
setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-pn57d in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-pn57d in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pn57d?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: > pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:55:15.972266 master-0 kubenswrapper[6932]: E0319 11:55:15.972208 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-pn57d_openshift-marketplace(ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-pn57d_openshift-marketplace(ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-pn57d_openshift-marketplace_ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c_0(de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe): error adding pod openshift-marketplace_certified-operators-pn57d to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe\\\" 
Netns:\\\"/var/run/netns/f7ff9b64-9543-4510-9b68-cd1845e2c68c\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-pn57d;K8S_POD_INFRA_CONTAINER_ID=de4fdf96a23a0bb3097271164c9c3e228ac64981443c9c0fcbd76eef8b1fe8fe;K8S_POD_UID=ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/certified-operators-pn57d] networking: Multus: [openshift-marketplace/certified-operators-pn57d/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-pn57d in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-pn57d in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pn57d?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/certified-operators-pn57d" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" Mar 19 11:55:16.416257 master-0 kubenswrapper[6932]: I0319 11:55:16.416196 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:55:16.416911 master-0 kubenswrapper[6932]: I0319 11:55:16.416640 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:55:17.422318 master-0 kubenswrapper[6932]: I0319 11:55:17.422264 6932 generic.go:334] "Generic (PLEG): container finished" podID="dbcbba74-ac53-4724-a217-4d9b85e7c1db" containerID="b6e56f4e0942ab58cf693081930c0b921d6a49180ecc1e1f47356ba56a945538" exitCode=0 Mar 19 11:55:17.543333 master-0 kubenswrapper[6932]: I0319 11:55:17.543240 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:17.543333 master-0 kubenswrapper[6932]: I0319 11:55:17.543273 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:17.543333 master-0 kubenswrapper[6932]: I0319 11:55:17.543318 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:17.543801 master-0 kubenswrapper[6932]: I0319 11:55:17.543353 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:19.882521 master-0 
kubenswrapper[6932]: E0319 11:55:19.882426 6932 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:55:19.883691 master-0 kubenswrapper[6932]: E0319 11:55:19.882654 6932 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.012s" Mar 19 11:55:19.883691 master-0 kubenswrapper[6932]: I0319 11:55:19.882710 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:55:19.891352 master-0 kubenswrapper[6932]: I0319 11:55:19.891260 6932 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 19 11:55:19.912334 master-0 kubenswrapper[6932]: E0319 11:55:19.912244 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 19 11:55:20.442102 master-0 kubenswrapper[6932]: I0319 11:55:20.442022 6932 generic.go:334] "Generic (PLEG): container finished" podID="6611e325-6152-480c-9c2c-1b503e49ccd2" containerID="5b56b51126590bf802dd88d10f125adb62528aa19311215ff5bc2461894ca90f" exitCode=0 Mar 19 11:55:20.542868 master-0 kubenswrapper[6932]: I0319 11:55:20.542791 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:20.543077 master-0 kubenswrapper[6932]: I0319 11:55:20.542879 6932 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:20.543194 master-0 kubenswrapper[6932]: I0319 11:55:20.543105 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:20.543424 master-0 kubenswrapper[6932]: I0319 11:55:20.543281 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:23.167782 master-0 kubenswrapper[6932]: E0319 11:55:23.167484 6932 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{control-plane-machine-set-operator-6f97756bc8-j7rc9.189e3bfba239e472 openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:control-plane-machine-set-operator-6f97756bc8-j7rc9,UID:7a51eeaf-1349-4bf3-932d-22ed5ce7c161,APIVersion:v1,ResourceVersion:8902,FieldPath:spec.containers{control-plane-machine-set-operator},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:908eaaf624959bc7645f6d585d160431d1efb070e9a1f37fefed73a3be42b0d3\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:54:15.175087218 +0000 UTC m=+79.534147440,LastTimestamp:2026-03-19 11:54:15.175087218 +0000 UTC m=+79.534147440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:55:23.543720 master-0 kubenswrapper[6932]: I0319 11:55:23.543636 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:23.543720 master-0 kubenswrapper[6932]: I0319 11:55:23.543676 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:23.544092 master-0 kubenswrapper[6932]: I0319 11:55:23.543784 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:23.544092 master-0 kubenswrapper[6932]: I0319 11:55:23.543782 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get 
\"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:26.543517 master-0 kubenswrapper[6932]: I0319 11:55:26.543402 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:26.544273 master-0 kubenswrapper[6932]: I0319 11:55:26.543525 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:29.543348 master-0 kubenswrapper[6932]: I0319 11:55:29.543292 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:29.543348 master-0 kubenswrapper[6932]: I0319 11:55:29.543341 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:30.113656 master-0 kubenswrapper[6932]: E0319 11:55:30.113559 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline 
exceeded" interval="400ms" Mar 19 11:55:31.506800 master-0 kubenswrapper[6932]: I0319 11:55:31.506708 6932 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c" exitCode=1 Mar 19 11:55:32.049262 master-0 kubenswrapper[6932]: E0319 11:55:32.049002 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:55:22Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:55:22Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:55:22Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:55:22Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6
bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487}
,{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f
9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:908eaaf624959bc7645f6d585d160431d1efb070e9a1f37fefed73a3be42b0d3\\\"],\\\"sizeBytes\\\":470681292},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":468265024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:759fb1d5353dbbadd443f38631d977ca3aed9787b873be05cc9660532a252739\\\"],\\\"sizeBytes\\\":448828620},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483\\\"],\\\"sizeBytes\\\":438654374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3a494212f1ba17f0f0980eef583218330eccb56eadf6b8cb0548c76d99b5014\\\"],\\\"sizeBytes\\\":407347125},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422\\\"],\\\"sizeBytes\\\":396521761}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\
":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:32.513672 master-0 kubenswrapper[6932]: I0319 11:55:32.513597 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_870e66ff-82ed-4c91-8197-dddcb78048c2/installer/0.log" Mar 19 11:55:32.513672 master-0 kubenswrapper[6932]: I0319 11:55:32.513656 6932 generic.go:334] "Generic (PLEG): container finished" podID="870e66ff-82ed-4c91-8197-dddcb78048c2" containerID="42a335ff2e41047c0beba4d30a5bd16330153a9f1ce92821358c191efd6f3fc9" exitCode=1 Mar 19 11:55:32.543861 master-0 kubenswrapper[6932]: I0319 11:55:32.543779 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:32.544027 master-0 kubenswrapper[6932]: I0319 11:55:32.543877 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:35.543089 master-0 kubenswrapper[6932]: I0319 11:55:35.543033 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:35.543745 master-0 
kubenswrapper[6932]: I0319 11:55:35.543098 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:38.543787 master-0 kubenswrapper[6932]: I0319 11:55:38.543599 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:38.543787 master-0 kubenswrapper[6932]: I0319 11:55:38.543673 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:40.514820 master-0 kubenswrapper[6932]: E0319 11:55:40.514675 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 19 11:55:41.543828 master-0 kubenswrapper[6932]: I0319 11:55:41.543695 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:41.543828 master-0 kubenswrapper[6932]: 
I0319 11:55:41.543800 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:42.050242 master-0 kubenswrapper[6932]: E0319 11:55:42.050169 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:44.543228 master-0 kubenswrapper[6932]: I0319 11:55:44.543101 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:44.543228 master-0 kubenswrapper[6932]: I0319 11:55:44.543175 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:47.544164 master-0 kubenswrapper[6932]: I0319 11:55:47.544011 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:47.544164 master-0 kubenswrapper[6932]: I0319 11:55:47.544133 6932 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:50.543991 master-0 kubenswrapper[6932]: I0319 11:55:50.543884 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:50.543991 master-0 kubenswrapper[6932]: I0319 11:55:50.543982 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: E0319 11:55:50.869757 6932 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-rrvxk_openshift-marketplace_64f5cbf1-f761-4531-8e5c-1f9b318b0cb9_0(5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1): error adding pod openshift-marketplace_community-operators-rrvxk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1" Netns:"/var/run/netns/ac3422c8-1066-4839-8554-07872649a189" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-rrvxk;K8S_POD_INFRA_CONTAINER_ID=5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1;K8S_POD_UID=64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-rrvxk] networking: Multus: [openshift-marketplace/community-operators-rrvxk/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-rrvxk in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-rrvxk in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rrvxk?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: > Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: E0319 11:55:50.869832 6932 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-rrvxk_openshift-marketplace_64f5cbf1-f761-4531-8e5c-1f9b318b0cb9_0(5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1): error adding pod openshift-marketplace_community-operators-rrvxk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): 
CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1" Netns:"/var/run/netns/ac3422c8-1066-4839-8554-07872649a189" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-rrvxk;K8S_POD_INFRA_CONTAINER_ID=5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1;K8S_POD_UID=64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-rrvxk] networking: Multus: [openshift-marketplace/community-operators-rrvxk/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-rrvxk in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-rrvxk in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rrvxk?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: > pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: E0319 11:55:50.869851 6932 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_community-operators-rrvxk_openshift-marketplace_64f5cbf1-f761-4531-8e5c-1f9b318b0cb9_0(5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1): error adding pod openshift-marketplace_community-operators-rrvxk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1" Netns:"/var/run/netns/ac3422c8-1066-4839-8554-07872649a189" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-rrvxk;K8S_POD_INFRA_CONTAINER_ID=5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1;K8S_POD_UID=64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-rrvxk] networking: Multus: [openshift-marketplace/community-operators-rrvxk/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-rrvxk in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-rrvxk in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rrvxk?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:55:50.873271 master-0 kubenswrapper[6932]: > pod="openshift-marketplace/community-operators-rrvxk" Mar 19 
11:55:50.873271 master-0 kubenswrapper[6932]: E0319 11:55:50.869913 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-rrvxk_openshift-marketplace(64f5cbf1-f761-4531-8e5c-1f9b318b0cb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-rrvxk_openshift-marketplace(64f5cbf1-f761-4531-8e5c-1f9b318b0cb9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-rrvxk_openshift-marketplace_64f5cbf1-f761-4531-8e5c-1f9b318b0cb9_0(5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1): error adding pod openshift-marketplace_community-operators-rrvxk to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1\\\" Netns:\\\"/var/run/netns/ac3422c8-1066-4839-8554-07872649a189\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-rrvxk;K8S_POD_INFRA_CONTAINER_ID=5653e876880f9c4e6e8bd5c622ba726ff16d52c7369d8668aaf6bf92d8fb83a1;K8S_POD_UID=64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/community-operators-rrvxk] networking: Multus: [openshift-marketplace/community-operators-rrvxk/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-rrvxk in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-rrvxk in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rrvxk?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/community-operators-rrvxk" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" Mar 19 11:55:51.316463 master-0 kubenswrapper[6932]: E0319 11:55:51.316337 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 19 11:55:52.051067 master-0 kubenswrapper[6932]: E0319 11:55:52.050916 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:53.543581 master-0 kubenswrapper[6932]: I0319 11:55:53.543481 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:53.544257 master-0 kubenswrapper[6932]: I0319 11:55:53.543604 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" 
output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:53.894068 master-0 kubenswrapper[6932]: E0319 11:55:53.893916 6932 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:55:53.894297 master-0 kubenswrapper[6932]: E0319 11:55:53.894127 6932 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.011s" Mar 19 11:55:53.894666 master-0 kubenswrapper[6932]: I0319 11:55:53.894626 6932 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:55:53.895946 master-0 kubenswrapper[6932]: I0319 11:55:53.895909 6932 scope.go:117] "RemoveContainer" containerID="b1921d5234eb4af4d7731c20be87a9595434841b33d272f8f2c3ade584fe4c62" Mar 19 11:55:53.896029 master-0 kubenswrapper[6932]: I0319 11:55:53.895953 6932 scope.go:117] "RemoveContainer" containerID="a63fe33504bcc71f9b4e0c9d251065dc432b3176905c1514b755fad213c3ed25" Mar 19 11:55:53.896203 master-0 kubenswrapper[6932]: I0319 11:55:53.896141 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:53.896274 master-0 kubenswrapper[6932]: I0319 11:55:53.896232 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 
11:55:53.897552 master-0 kubenswrapper[6932]: I0319 11:55:53.896687 6932 scope.go:117] "RemoveContainer" containerID="a00e4976297d868e9d1a74ee69351e1ac6225f1b3fff400804a95076bf8deddd" Mar 19 11:55:53.898207 master-0 kubenswrapper[6932]: I0319 11:55:53.898136 6932 scope.go:117] "RemoveContainer" containerID="a35a4f30770261f78e16c8cbde80e6ad1d01d59985d717446c5cf700c3ca0a3e" Mar 19 11:55:53.900681 master-0 kubenswrapper[6932]: I0319 11:55:53.900611 6932 scope.go:117] "RemoveContainer" containerID="27aeacdf42166ebdfe7943145673659894eb1a05c94251adf45a06c9d05c04a8" Mar 19 11:55:53.903417 master-0 kubenswrapper[6932]: I0319 11:55:53.903371 6932 scope.go:117] "RemoveContainer" containerID="5b56b51126590bf802dd88d10f125adb62528aa19311215ff5bc2461894ca90f" Mar 19 11:55:53.903882 master-0 kubenswrapper[6932]: I0319 11:55:53.903846 6932 scope.go:117] "RemoveContainer" containerID="294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c" Mar 19 11:55:53.904095 master-0 kubenswrapper[6932]: I0319 11:55:53.904058 6932 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"dce9d46436a1eb563a3c64b6a41398261e837f5052ca426ad3863d8198f90a75"} pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 19 11:55:53.904133 master-0 kubenswrapper[6932]: I0319 11:55:53.904096 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" containerID="cri-o://dce9d46436a1eb563a3c64b6a41398261e837f5052ca426ad3863d8198f90a75" gracePeriod=30 Mar 19 11:55:53.906708 master-0 kubenswrapper[6932]: I0319 11:55:53.906669 6932 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" 
Mar 19 11:55:54.645756 master-0 kubenswrapper[6932]: I0319 11:55:54.645689 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-7fz6w_3c3b0d24-ce5e-49c3-a546-874356f75dc6/network-operator/0.log" Mar 19 11:55:54.647269 master-0 kubenswrapper[6932]: I0319 11:55:54.647238 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-j528w_8438d015-106b-4aed-ae12-dda781ce51fc/approver/0.log" Mar 19 11:55:54.651691 master-0 kubenswrapper[6932]: I0319 11:55:54.651668 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-ng9ss_a3ceeece-bee9-4fcb-8517-95ebce38e223/openshift-config-operator/1.log" Mar 19 11:55:54.652292 master-0 kubenswrapper[6932]: I0319 11:55:54.652263 6932 generic.go:334] "Generic (PLEG): container finished" podID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerID="dce9d46436a1eb563a3c64b6a41398261e837f5052ca426ad3863d8198f90a75" exitCode=255 Mar 19 11:55:54.881280 master-0 kubenswrapper[6932]: I0319 11:55:54.881251 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_870e66ff-82ed-4c91-8197-dddcb78048c2/installer/0.log" Mar 19 11:55:54.881399 master-0 kubenswrapper[6932]: I0319 11:55:54.881313 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:55:55.022094 master-0 kubenswrapper[6932]: I0319 11:55:55.022038 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-var-lock\") pod \"870e66ff-82ed-4c91-8197-dddcb78048c2\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " Mar 19 11:55:55.022094 master-0 kubenswrapper[6932]: I0319 11:55:55.022104 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-kubelet-dir\") pod \"870e66ff-82ed-4c91-8197-dddcb78048c2\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " Mar 19 11:55:55.022380 master-0 kubenswrapper[6932]: I0319 11:55:55.022143 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/870e66ff-82ed-4c91-8197-dddcb78048c2-kube-api-access\") pod \"870e66ff-82ed-4c91-8197-dddcb78048c2\" (UID: \"870e66ff-82ed-4c91-8197-dddcb78048c2\") " Mar 19 11:55:55.022380 master-0 kubenswrapper[6932]: I0319 11:55:55.022161 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-var-lock" (OuterVolumeSpecName: "var-lock") pod "870e66ff-82ed-4c91-8197-dddcb78048c2" (UID: "870e66ff-82ed-4c91-8197-dddcb78048c2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:55.022380 master-0 kubenswrapper[6932]: I0319 11:55:55.022200 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "870e66ff-82ed-4c91-8197-dddcb78048c2" (UID: "870e66ff-82ed-4c91-8197-dddcb78048c2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:55.022380 master-0 kubenswrapper[6932]: I0319 11:55:55.022319 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:55.022380 master-0 kubenswrapper[6932]: I0319 11:55:55.022336 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/870e66ff-82ed-4c91-8197-dddcb78048c2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:55.024696 master-0 kubenswrapper[6932]: I0319 11:55:55.024667 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870e66ff-82ed-4c91-8197-dddcb78048c2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "870e66ff-82ed-4c91-8197-dddcb78048c2" (UID: "870e66ff-82ed-4c91-8197-dddcb78048c2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:55:55.123104 master-0 kubenswrapper[6932]: I0319 11:55:55.123030 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/870e66ff-82ed-4c91-8197-dddcb78048c2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:55.663567 master-0 kubenswrapper[6932]: I0319 11:55:55.663310 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_870e66ff-82ed-4c91-8197-dddcb78048c2/installer/0.log" Mar 19 11:55:55.663567 master-0 kubenswrapper[6932]: I0319 11:55:55.663428 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:55:56.543692 master-0 kubenswrapper[6932]: I0319 11:55:56.543556 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:56.544064 master-0 kubenswrapper[6932]: I0319 11:55:56.543694 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:55:57.171144 master-0 kubenswrapper[6932]: E0319 11:55:57.170982 6932 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{control-plane-machine-set-operator-6f97756bc8-j7rc9.189e3bfc050f456d openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:control-plane-machine-set-operator-6f97756bc8-j7rc9,UID:7a51eeaf-1349-4bf3-932d-22ed5ce7c161,APIVersion:v1,ResourceVersion:8902,FieldPath:spec.containers{control-plane-machine-set-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:908eaaf624959bc7645f6d585d160431d1efb070e9a1f37fefed73a3be42b0d3\" in 1.658s (1.658s including waiting). 
Image size: 470681292 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:54:16.833238381 +0000 UTC m=+81.192298603,LastTimestamp:2026-03-19 11:54:16.833238381 +0000 UTC m=+81.192298603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:55:59.543554 master-0 kubenswrapper[6932]: I0319 11:55:59.543450 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:55:59.544217 master-0 kubenswrapper[6932]: I0319 11:55:59.543594 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:02.051493 master-0 kubenswrapper[6932]: E0319 11:56:02.051378 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:02.543863 master-0 kubenswrapper[6932]: I0319 11:56:02.543717 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:02.543863 master-0 kubenswrapper[6932]: I0319 11:56:02.543858 6932 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:02.918197 master-0 kubenswrapper[6932]: E0319 11:56:02.917767 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 19 11:56:05.543856 master-0 kubenswrapper[6932]: I0319 11:56:05.543625 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:05.544969 master-0 kubenswrapper[6932]: I0319 11:56:05.543832 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:08.544130 master-0 kubenswrapper[6932]: I0319 11:56:08.544026 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:08.544130 master-0 kubenswrapper[6932]: I0319 11:56:08.544120 6932 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:08.744764 master-0 kubenswrapper[6932]: I0319 11:56:08.744660 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-6ghdm_e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/openshift-controller-manager-operator/1.log" Mar 19 11:56:08.745712 master-0 kubenswrapper[6932]: I0319 11:56:08.745661 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-6ghdm_e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/openshift-controller-manager-operator/0.log" Mar 19 11:56:08.745832 master-0 kubenswrapper[6932]: I0319 11:56:08.745746 6932 generic.go:334] "Generic (PLEG): container finished" podID="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" containerID="fd8c32d22caf0bf1b1f569479b7d959cb1e7f7190abe63f16601f2e5b50a0711" exitCode=255 Mar 19 11:56:11.543972 master-0 kubenswrapper[6932]: I0319 11:56:11.543883 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:11.544699 master-0 kubenswrapper[6932]: I0319 11:56:11.543981 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: 
connection refused" Mar 19 11:56:12.052652 master-0 kubenswrapper[6932]: E0319 11:56:12.052573 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:12.052652 master-0 kubenswrapper[6932]: E0319 11:56:12.052619 6932 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 11:56:12.767197 master-0 kubenswrapper[6932]: I0319 11:56:12.767141 6932 generic.go:334] "Generic (PLEG): container finished" podID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerID="ecca7c744f565812652616c950bf4c3ba074defb48c439f60ea10ec59b205e80" exitCode=0 Mar 19 11:56:13.775482 master-0 kubenswrapper[6932]: I0319 11:56:13.775413 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/0.log" Mar 19 11:56:13.775482 master-0 kubenswrapper[6932]: I0319 11:56:13.775460 6932 generic.go:334] "Generic (PLEG): container finished" podID="163d6a3d-0080-4122-bb7a-17f6e63f00f0" containerID="a5a674d7299c49bd88f1c56fca174966ef4c28920edc64023b6ce41812e041c8" exitCode=1 Mar 19 11:56:14.543693 master-0 kubenswrapper[6932]: I0319 11:56:14.543538 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:14.543693 master-0 kubenswrapper[6932]: I0319 11:56:14.543669 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:15.174100 master-0 kubenswrapper[6932]: I0319 11:56:15.174026 6932 status_manager.go:851] "Failed to get status for pod" podUID="96498b3d-c93f-4b42-a0aa-2afec3450b1d" pod="openshift-kube-scheduler/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)" Mar 19 11:56:15.787007 master-0 kubenswrapper[6932]: I0319 11:56:15.786898 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/0.log" Mar 19 11:56:15.787007 master-0 kubenswrapper[6932]: I0319 11:56:15.786950 6932 generic.go:334] "Generic (PLEG): container finished" podID="d625c81e-01cc-424a-997d-546a5204a72b" containerID="e21a965ed4cb2dd18edb22058723998ac546681c370497fc8735a2d87bc17971" exitCode=1 Mar 19 11:56:16.119572 master-0 kubenswrapper[6932]: E0319 11:56:16.119359 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 19 11:56:17.055638 master-0 kubenswrapper[6932]: E0319 11:56:17.055563 6932 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 11:56:17.055638 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-pn57d_openshift-marketplace_ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c_0(7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e): error adding pod openshift-marketplace_certified-operators-pn57d to CNI network "multus-cni-network": plugin 
type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e" Netns:"/var/run/netns/ca7457bb-d75d-498f-9829-9295d49f0d64" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-pn57d;K8S_POD_INFRA_CONTAINER_ID=7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e;K8S_POD_UID=ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-pn57d] networking: Multus: [openshift-marketplace/certified-operators-pn57d/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-pn57d in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-pn57d in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pn57d?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:56:17.055638 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:56:17.055638 master-0 kubenswrapper[6932]: > Mar 19 11:56:17.056335 master-0 kubenswrapper[6932]: E0319 11:56:17.055668 6932 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 11:56:17.056335 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_certified-operators-pn57d_openshift-marketplace_ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c_0(7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e): error adding pod openshift-marketplace_certified-operators-pn57d to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e" Netns:"/var/run/netns/ca7457bb-d75d-498f-9829-9295d49f0d64" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-pn57d;K8S_POD_INFRA_CONTAINER_ID=7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e;K8S_POD_UID=ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-pn57d] networking: Multus: [openshift-marketplace/certified-operators-pn57d/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-pn57d in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-pn57d in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pn57d?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:56:17.056335 master-0 kubenswrapper[6932]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:56:17.056335 master-0 kubenswrapper[6932]: > pod="openshift-marketplace/certified-operators-pn57d" Mar 19 
11:56:17.056335 master-0 kubenswrapper[6932]: E0319 11:56:17.055708 6932 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 11:56:17.056335 master-0 kubenswrapper[6932]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-pn57d_openshift-marketplace_ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c_0(7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e): error adding pod openshift-marketplace_certified-operators-pn57d to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e" Netns:"/var/run/netns/ca7457bb-d75d-498f-9829-9295d49f0d64" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-pn57d;K8S_POD_INFRA_CONTAINER_ID=7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e;K8S_POD_UID=ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-pn57d] networking: Multus: [openshift-marketplace/certified-operators-pn57d/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-pn57d in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-pn57d in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pn57d?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 11:56:17.056335 master-0 kubenswrapper[6932]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 11:56:17.056335 master-0 kubenswrapper[6932]: > pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:56:17.056335 master-0 kubenswrapper[6932]: E0319 11:56:17.055875 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-pn57d_openshift-marketplace(ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-pn57d_openshift-marketplace(ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-pn57d_openshift-marketplace_ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c_0(7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e): error adding pod openshift-marketplace_certified-operators-pn57d to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e\\\" Netns:\\\"/var/run/netns/ca7457bb-d75d-498f-9829-9295d49f0d64\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-pn57d;K8S_POD_INFRA_CONTAINER_ID=7233b9e7c26caa16f6d544d8285aaafb32e74d02bcb77192eef748f7ca56891e;K8S_POD_UID=ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/certified-operators-pn57d] networking: Multus: [openshift-marketplace/certified-operators-pn57d/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c]: error setting 
the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-pn57d in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-pn57d in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pn57d?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/certified-operators-pn57d" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" Mar 19 11:56:17.543674 master-0 kubenswrapper[6932]: I0319 11:56:17.543533 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:17.544046 master-0 kubenswrapper[6932]: I0319 11:56:17.543654 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:17.797506 master-0 kubenswrapper[6932]: I0319 11:56:17.797329 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:56:17.798025 master-0 kubenswrapper[6932]: I0319 11:56:17.797987 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:56:20.544326 master-0 kubenswrapper[6932]: I0319 11:56:20.544170 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:20.545378 master-0 kubenswrapper[6932]: I0319 11:56:20.544342 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:21.827535 master-0 kubenswrapper[6932]: I0319 11:56:21.827423 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-mjwfm_1b94d1eb-1b80-4a14-b1c0-d9e192231352/manager/0.log" Mar 19 11:56:21.827535 master-0 kubenswrapper[6932]: I0319 11:56:21.827491 6932 generic.go:334] "Generic (PLEG): container finished" podID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerID="8fcb298ecd66e79f2851c7b4502a7734938f56462fb5de52ed324ec2a3679f14" exitCode=1 Mar 19 11:56:21.943101 master-0 kubenswrapper[6932]: I0319 11:56:21.943007 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= 
Mar 19 11:56:21.943496 master-0 kubenswrapper[6932]: I0319 11:56:21.943105 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 19 11:56:21.943496 master-0 kubenswrapper[6932]: I0319 11:56:21.943018 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 19 11:56:21.943496 master-0 kubenswrapper[6932]: I0319 11:56:21.943228 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 19 11:56:22.728390 master-0 kubenswrapper[6932]: I0319 11:56:22.728298 6932 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-mjwfm container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 19 11:56:22.728771 master-0 kubenswrapper[6932]: I0319 11:56:22.728388 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" podUID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 19 
11:56:22.834870 master-0 kubenswrapper[6932]: I0319 11:56:22.834648 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/manager/0.log" Mar 19 11:56:22.835454 master-0 kubenswrapper[6932]: I0319 11:56:22.835090 6932 generic.go:334] "Generic (PLEG): container finished" podID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerID="4c09f5575088b49e0ef7e52a5eb347dfd8470474e6a6ff5bf019908a8d6b87bc" exitCode=1 Mar 19 11:56:23.543987 master-0 kubenswrapper[6932]: I0319 11:56:23.543882 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:23.543987 master-0 kubenswrapper[6932]: I0319 11:56:23.543974 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:26.543218 master-0 kubenswrapper[6932]: I0319 11:56:26.543102 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:26.543218 master-0 kubenswrapper[6932]: I0319 11:56:26.543211 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:27.910072 master-0 kubenswrapper[6932]: E0319 11:56:27.910017 6932 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:56:27.910636 master-0 kubenswrapper[6932]: E0319 11:56:27.910194 6932 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s" Mar 19 11:56:27.910636 master-0 kubenswrapper[6932]: I0319 11:56:27.910291 6932 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:56:27.910636 master-0 kubenswrapper[6932]: I0319 11:56:27.910303 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:56:27.910793 master-0 kubenswrapper[6932]: I0319 11:56:27.910717 6932 scope.go:117] "RemoveContainer" containerID="fd8c32d22caf0bf1b1f569479b7d959cb1e7f7190abe63f16601f2e5b50a0711" Mar 19 11:56:27.910849 master-0 kubenswrapper[6932]: I0319 11:56:27.910783 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:56:27.911141 master-0 kubenswrapper[6932]: I0319 11:56:27.911102 6932 scope.go:117] "RemoveContainer" containerID="ffd01994498e412e963b01ac06f0e6ad28082a18471897dde077305cc7888366" Mar 19 11:56:27.911276 master-0 kubenswrapper[6932]: I0319 11:56:27.911241 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:56:27.911763 master-0 kubenswrapper[6932]: I0319 11:56:27.911353 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:27.911763 master-0 kubenswrapper[6932]: I0319 11:56:27.911393 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:27.912243 master-0 kubenswrapper[6932]: I0319 11:56:27.912226 6932 scope.go:117] "RemoveContainer" containerID="b6e56f4e0942ab58cf693081930c0b921d6a49180ecc1e1f47356ba56a945538" Mar 19 11:56:27.913528 master-0 kubenswrapper[6932]: I0319 11:56:27.913510 6932 scope.go:117] "RemoveContainer" containerID="f48ebfe02dc1f93683f1d2eea873f5d0c2c3081e3483e2d09faebd411fa396ef" Mar 19 11:56:27.919492 master-0 kubenswrapper[6932]: I0319 11:56:27.919461 6932 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 19 11:56:28.886904 master-0 kubenswrapper[6932]: I0319 11:56:28.886867 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-6ghdm_e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/openshift-controller-manager-operator/1.log" Mar 19 11:56:28.887817 master-0 kubenswrapper[6932]: I0319 11:56:28.887774 6932 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-6ghdm_e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/openshift-controller-manager-operator/0.log" Mar 19 11:56:29.543581 master-0 kubenswrapper[6932]: I0319 11:56:29.543472 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:29.543581 master-0 kubenswrapper[6932]: I0319 11:56:29.543554 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:29.544292 master-0 kubenswrapper[6932]: I0319 11:56:29.543603 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:29.544292 master-0 kubenswrapper[6932]: I0319 11:56:29.543667 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:30.634613 master-0 kubenswrapper[6932]: I0319 11:56:30.634507 6932 patch_prober.go:28] interesting pod/catalogd-controller-manager-6864dc98f7-xzxpq container/manager 
namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.37:8081/readyz\": dial tcp 10.128.0.37:8081: connect: connection refused" start-of-body= Mar 19 11:56:30.635282 master-0 kubenswrapper[6932]: I0319 11:56:30.634765 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" podUID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.37:8081/readyz\": dial tcp 10.128.0.37:8081: connect: connection refused" Mar 19 11:56:31.174986 master-0 kubenswrapper[6932]: E0319 11:56:31.174710 6932 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{control-plane-machine-set-operator-6f97756bc8-j7rc9.189e3bfc0abaa984 openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:control-plane-machine-set-operator-6f97756bc8-j7rc9,UID:7a51eeaf-1349-4bf3-932d-22ed5ce7c161,APIVersion:v1,ResourceVersion:8902,FieldPath:spec.containers{control-plane-machine-set-operator},},Reason:Created,Message:Created container: control-plane-machine-set-operator,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:54:16.92835674 +0000 UTC m=+81.287416962,LastTimestamp:2026-03-19 11:54:16.92835674 +0000 UTC m=+81.287416962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:56:31.943102 master-0 kubenswrapper[6932]: I0319 11:56:31.943036 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: 
connection refused" start-of-body= Mar 19 11:56:31.943802 master-0 kubenswrapper[6932]: I0319 11:56:31.943124 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 19 11:56:31.943802 master-0 kubenswrapper[6932]: I0319 11:56:31.943679 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 19 11:56:31.943802 master-0 kubenswrapper[6932]: I0319 11:56:31.943766 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 19 11:56:32.088165 master-0 kubenswrapper[6932]: E0319 11:56:32.087969 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:56:22Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:56:22Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:56:22Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:56:22Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c69833
95b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:908eaaf624959bc7645f6d585d160431d1efb070e9a1f37fefed73a3be42b0d3\\\"],\\\"sizeBytes\\\":470681292},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":468265024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\
"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:759fb1d5353dbbadd443f38631d977ca3aed9787b873be05cc9660532a252739\\\"],\\\"sizeBytes\\\":448828620},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483\\\"],\\\"sizeBytes\\\":438654374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3a494212f1ba17f0f0980eef583218330eccb56eadf6b8cb0548c76d99b5014\\\"],\\\"sizeBytes\\\":407347125},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422\\\"],\\\"sizeBytes\\\":396521761}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:32.521652 master-0 kubenswrapper[6932]: E0319 11:56:32.521534 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 11:56:32.543628 master-0 kubenswrapper[6932]: I0319 11:56:32.543492 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:32.543628 master-0 kubenswrapper[6932]: I0319 11:56:32.543617 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:32.544056 master-0 kubenswrapper[6932]: I0319 11:56:32.543792 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:32.544056 master-0 kubenswrapper[6932]: I0319 11:56:32.543903 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:32.728354 master-0 kubenswrapper[6932]: I0319 11:56:32.728249 6932 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-mjwfm 
container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 19 11:56:32.728354 master-0 kubenswrapper[6932]: I0319 11:56:32.728319 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" podUID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 19 11:56:32.728640 master-0 kubenswrapper[6932]: I0319 11:56:32.728344 6932 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-mjwfm container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 19 11:56:32.728640 master-0 kubenswrapper[6932]: I0319 11:56:32.728455 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" podUID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 19 11:56:35.543033 master-0 kubenswrapper[6932]: I0319 11:56:35.542961 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:35.543033 master-0 kubenswrapper[6932]: I0319 11:56:35.543006 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss 
container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:35.543690 master-0 kubenswrapper[6932]: I0319 11:56:35.543031 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:35.543690 master-0 kubenswrapper[6932]: I0319 11:56:35.543068 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:38.543486 master-0 kubenswrapper[6932]: I0319 11:56:38.543374 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:38.543486 master-0 kubenswrapper[6932]: I0319 11:56:38.543449 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:40.634621 master-0 kubenswrapper[6932]: I0319 11:56:40.634540 6932 patch_prober.go:28] interesting 
pod/catalogd-controller-manager-6864dc98f7-xzxpq container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.37:8081/healthz\": dial tcp 10.128.0.37:8081: connect: connection refused" start-of-body= Mar 19 11:56:40.635127 master-0 kubenswrapper[6932]: I0319 11:56:40.634549 6932 patch_prober.go:28] interesting pod/catalogd-controller-manager-6864dc98f7-xzxpq container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.37:8081/readyz\": dial tcp 10.128.0.37:8081: connect: connection refused" start-of-body= Mar 19 11:56:40.635127 master-0 kubenswrapper[6932]: I0319 11:56:40.634697 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" podUID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.37:8081/readyz\": dial tcp 10.128.0.37:8081: connect: connection refused" Mar 19 11:56:40.635127 master-0 kubenswrapper[6932]: I0319 11:56:40.634623 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" podUID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.37:8081/healthz\": dial tcp 10.128.0.37:8081: connect: connection refused" Mar 19 11:56:40.917020 master-0 kubenswrapper[6932]: E0319 11:56:40.916856 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 11:56:41.543247 master-0 kubenswrapper[6932]: I0319 11:56:41.543189 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 
10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:41.543467 master-0 kubenswrapper[6932]: I0319 11:56:41.543274 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:41.943089 master-0 kubenswrapper[6932]: I0319 11:56:41.943043 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 19 11:56:41.943089 master-0 kubenswrapper[6932]: I0319 11:56:41.943056 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 19 11:56:41.952159 master-0 kubenswrapper[6932]: I0319 11:56:41.943098 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 19 11:56:41.952159 master-0 kubenswrapper[6932]: I0319 11:56:41.943112 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 
10.128.0.12:8080: connect: connection refused" Mar 19 11:56:41.970118 master-0 kubenswrapper[6932]: I0319 11:56:41.970061 6932 generic.go:334] "Generic (PLEG): container finished" podID="daf4dbb6-5a0a-4c92-a930-479a7330ace1" containerID="1961370e7c6f3b39c50205c4d3f694632a63f87701e4b9c1a6a05e005ec065b1" exitCode=0 Mar 19 11:56:42.089010 master-0 kubenswrapper[6932]: E0319 11:56:42.088830 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:42.728870 master-0 kubenswrapper[6932]: I0319 11:56:42.728786 6932 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-mjwfm container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 19 11:56:42.729129 master-0 kubenswrapper[6932]: I0319 11:56:42.728870 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" podUID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 19 11:56:44.546474 master-0 kubenswrapper[6932]: I0319 11:56:44.546406 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:44.546474 master-0 kubenswrapper[6932]: I0319 11:56:44.546477 6932 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:47.543828 master-0 kubenswrapper[6932]: I0319 11:56:47.543761 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:47.544619 master-0 kubenswrapper[6932]: I0319 11:56:47.544585 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:56:49.523029 master-0 kubenswrapper[6932]: E0319 11:56:49.522900 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 11:56:50.543187 master-0 kubenswrapper[6932]: I0319 11:56:50.543106 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:56:50.543940 master-0 kubenswrapper[6932]: I0319 11:56:50.543222 6932 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:56:50.634772 master-0 kubenswrapper[6932]: I0319 11:56:50.634656 6932 patch_prober.go:28] interesting pod/catalogd-controller-manager-6864dc98f7-xzxpq container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.37:8081/readyz\": dial tcp 10.128.0.37:8081: connect: connection refused" start-of-body=
Mar 19 11:56:50.635141 master-0 kubenswrapper[6932]: I0319 11:56:50.634827 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" podUID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.37:8081/readyz\": dial tcp 10.128.0.37:8081: connect: connection refused"
Mar 19 11:56:51.943678 master-0 kubenswrapper[6932]: I0319 11:56:51.943602 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body=
Mar 19 11:56:51.944288 master-0 kubenswrapper[6932]: I0319 11:56:51.943684 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused"
Mar 19 11:56:52.035370 master-0 kubenswrapper[6932]: I0319 11:56:52.035306 6932 generic.go:334] "Generic (PLEG): container finished" podID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerID="87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5" exitCode=0
Mar 19 11:56:52.089671 master-0 kubenswrapper[6932]: E0319 11:56:52.089599 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:56:52.728863 master-0 kubenswrapper[6932]: I0319 11:56:52.728802 6932 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-mjwfm container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body=
Mar 19 11:56:52.729099 master-0 kubenswrapper[6932]: I0319 11:56:52.728909 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" podUID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused"
Mar 19 11:56:52.729099 master-0 kubenswrapper[6932]: I0319 11:56:52.728820 6932 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-mjwfm container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body=
Mar 19 11:56:52.729099 master-0 kubenswrapper[6932]: I0319 11:56:52.729004 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" podUID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused"
Mar 19 11:56:53.380692 master-0 kubenswrapper[6932]: I0319 11:56:53.380594 6932 patch_prober.go:28] interesting pod/controller-manager-548bb99f44-txbjj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused" start-of-body=
Mar 19 11:56:53.381506 master-0 kubenswrapper[6932]: I0319 11:56:53.380710 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused"
Mar 19 11:56:53.381506 master-0 kubenswrapper[6932]: I0319 11:56:53.380795 6932 patch_prober.go:28] interesting pod/controller-manager-548bb99f44-txbjj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused" start-of-body=
Mar 19 11:56:53.381506 master-0 kubenswrapper[6932]: I0319 11:56:53.380973 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused"
Mar 19 11:56:53.542988 master-0 kubenswrapper[6932]: I0319 11:56:53.542877 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body=
Mar 19 11:56:53.542988 master-0 kubenswrapper[6932]: I0319 11:56:53.542952 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused"
Mar 19 11:56:56.060421 master-0 kubenswrapper[6932]: I0319 11:56:56.060365 6932 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3" exitCode=1
Mar 19 11:56:57.543323 master-0 kubenswrapper[6932]: I0319 11:56:57.543198 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 11:56:57.543966 master-0 kubenswrapper[6932]: I0319 11:56:57.543337 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:57:00.543259 master-0 kubenswrapper[6932]: I0319 11:57:00.543117 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 11:57:00.543259 master-0 kubenswrapper[6932]: I0319 11:57:00.543236 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:57:00.634773 master-0 kubenswrapper[6932]: I0319 11:57:00.634638 6932 patch_prober.go:28] interesting pod/catalogd-controller-manager-6864dc98f7-xzxpq container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.37:8081/readyz\": dial tcp 10.128.0.37:8081: connect: connection refused" start-of-body=
Mar 19 11:57:00.635164 master-0 kubenswrapper[6932]: I0319 11:57:00.634794 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" podUID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.37:8081/readyz\": dial tcp 10.128.0.37:8081: connect: connection refused"
Mar 19 11:57:00.635164 master-0 kubenswrapper[6932]: I0319 11:57:00.634694 6932 patch_prober.go:28] interesting pod/catalogd-controller-manager-6864dc98f7-xzxpq container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.37:8081/healthz\": dial tcp 10.128.0.37:8081: connect: connection refused" start-of-body=
Mar 19 11:57:00.635164 master-0 kubenswrapper[6932]: I0319 11:57:00.634983 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" podUID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.37:8081/healthz\": dial tcp 10.128.0.37:8081: connect: connection refused"
Mar 19 11:57:01.922026 master-0 kubenswrapper[6932]: E0319 11:57:01.921912 6932 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 11:57:01.922836 master-0 kubenswrapper[6932]: E0319 11:57:01.922177 6932 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.012s"
Mar 19 11:57:01.922836 master-0 kubenswrapper[6932]: I0319 11:57:01.922216 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:01.922836 master-0 kubenswrapper[6932]: I0319 11:57:01.922653 6932 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:01.924172 master-0 kubenswrapper[6932]: I0319 11:57:01.924106 6932 scope.go:117] "RemoveContainer" containerID="70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3"
Mar 19 11:57:01.924300 master-0 kubenswrapper[6932]: I0319 11:57:01.924273 6932 scope.go:117] "RemoveContainer" containerID="4c09f5575088b49e0ef7e52a5eb347dfd8470474e6a6ff5bf019908a8d6b87bc"
Mar 19 11:57:01.924371 master-0 kubenswrapper[6932]: I0319 11:57:01.924323 6932 scope.go:117] "RemoveContainer" containerID="a5a674d7299c49bd88f1c56fca174966ef4c28920edc64023b6ce41812e041c8"
Mar 19 11:57:01.924457 master-0 kubenswrapper[6932]: E0319 11:57:01.924424 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(46f265536aba6292ead501bc9b49f327)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327"
Mar 19 11:57:01.932593 master-0 kubenswrapper[6932]: I0319 11:57:01.932527 6932 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 19 11:57:01.949441 master-0 kubenswrapper[6932]: I0319 11:57:01.949350 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body=
Mar 19 11:57:01.949785 master-0 kubenswrapper[6932]: I0319 11:57:01.949449 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused"
Mar 19 11:57:02.089945 master-0 kubenswrapper[6932]: E0319 11:57:02.089905 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:57:02.095245 master-0 kubenswrapper[6932]: I0319 11:57:02.095208 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/0.log"
Mar 19 11:57:02.095818 master-0 kubenswrapper[6932]: I0319 11:57:02.095785 6932 scope.go:117] "RemoveContainer" containerID="70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3"
Mar 19 11:57:02.096056 master-0 kubenswrapper[6932]: E0319 11:57:02.096026 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(46f265536aba6292ead501bc9b49f327)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327"
Mar 19 11:57:02.728913 master-0 kubenswrapper[6932]: I0319 11:57:02.728807 6932 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-mjwfm container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body=
Mar 19 11:57:02.728913 master-0 kubenswrapper[6932]: I0319 11:57:02.728883 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" podUID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused"
Mar 19 11:57:03.102209 master-0 kubenswrapper[6932]: I0319 11:57:03.102081 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/manager/0.log"
Mar 19 11:57:03.380694 master-0 kubenswrapper[6932]: I0319 11:57:03.380570 6932 patch_prober.go:28] interesting pod/controller-manager-548bb99f44-txbjj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused" start-of-body=
Mar 19 11:57:03.380694 master-0 kubenswrapper[6932]: I0319 11:57:03.380625 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused"
Mar 19 11:57:03.380947 master-0 kubenswrapper[6932]: I0319 11:57:03.380681 6932 patch_prober.go:28] interesting pod/controller-manager-548bb99f44-txbjj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused" start-of-body=
Mar 19 11:57:03.380947 master-0 kubenswrapper[6932]: I0319 11:57:03.380814 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused"
Mar 19 11:57:03.542589 master-0 kubenswrapper[6932]: I0319 11:57:03.542516 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 11:57:03.542589 master-0 kubenswrapper[6932]: I0319 11:57:03.542599 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:57:05.178284 master-0 kubenswrapper[6932]: E0319 11:57:05.178123 6932 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{control-plane-machine-set-operator-6f97756bc8-j7rc9.189e3bfc0b52f3d2 openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:control-plane-machine-set-operator-6f97756bc8-j7rc9,UID:7a51eeaf-1349-4bf3-932d-22ed5ce7c161,APIVersion:v1,ResourceVersion:8902,FieldPath:spec.containers{control-plane-machine-set-operator},},Reason:Started,Message:Started container control-plane-machine-set-operator,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:54:16.938337234 +0000 UTC m=+81.297397456,LastTimestamp:2026-03-19 11:54:16.938337234 +0000 UTC m=+81.297397456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:57:06.524833 master-0 kubenswrapper[6932]: E0319 11:57:06.524707 6932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 11:57:06.543097 master-0 kubenswrapper[6932]: I0319 11:57:06.542984 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 11:57:06.543296 master-0 kubenswrapper[6932]: I0319 11:57:06.543120 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:57:07.458011 master-0 kubenswrapper[6932]: I0319 11:57:07.457956 6932 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-9w7hc container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": net/http: TLS handshake timeout" start-of-body=
Mar 19 11:57:07.458011 master-0 kubenswrapper[6932]: I0319 11:57:07.458015 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" podUID="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": net/http: TLS handshake timeout"
Mar 19 11:57:09.543033 master-0 kubenswrapper[6932]: I0319 11:57:09.542939 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 11:57:09.543703 master-0 kubenswrapper[6932]: I0319 11:57:09.543672 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:57:11.943883 master-0 kubenswrapper[6932]: I0319 11:57:11.943777 6932 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body=
Mar 19 11:57:11.944612 master-0 kubenswrapper[6932]: I0319 11:57:11.943882 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused"
Mar 19 11:57:12.090466 master-0 kubenswrapper[6932]: E0319 11:57:12.090258 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:57:12.090466 master-0 kubenswrapper[6932]: E0319 11:57:12.090315 6932 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 11:57:12.223508 master-0 kubenswrapper[6932]: E0319 11:57:12.223326 6932 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="10.3s"
Mar 19 11:57:12.223508 master-0 kubenswrapper[6932]: I0319 11:57:12.223428 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:12.223508 master-0 kubenswrapper[6932]: I0319 11:57:12.223476 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:12.224545 master-0 kubenswrapper[6932]: I0319 11:57:12.224502 6932 scope.go:117] "RemoveContainer" containerID="ecca7c744f565812652616c950bf4c3ba074defb48c439f60ea10ec59b205e80"
Mar 19 11:57:12.224818 master-0 kubenswrapper[6932]: I0319 11:57:12.224769 6932 scope.go:117] "RemoveContainer" containerID="70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3"
Mar 19 11:57:12.225278 master-0 kubenswrapper[6932]: E0319 11:57:12.225129 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(46f265536aba6292ead501bc9b49f327)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327"
Mar 19 11:57:12.225507 master-0 kubenswrapper[6932]: I0319 11:57:12.225474 6932 scope.go:117] "RemoveContainer" containerID="e21a965ed4cb2dd18edb22058723998ac546681c370497fc8735a2d87bc17971"
Mar 19 11:57:12.243830 master-0 kubenswrapper[6932]: I0319 11:57:12.243641 6932 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 19 11:57:12.247709 master-0 kubenswrapper[6932]: W0319 11:57:12.247659 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f5cbf1_f761_4531_8e5c_1f9b318b0cb9.slice/crio-c79535746841947fd388ae5680d8b49dbd8b1a4914cee13d44e9d1f304783117 WatchSource:0}: Error finding container c79535746841947fd388ae5680d8b49dbd8b1a4914cee13d44e9d1f304783117: Status 404 returned error can't find the container with id c79535746841947fd388ae5680d8b49dbd8b1a4914cee13d44e9d1f304783117
Mar 19 11:57:12.248070 master-0 kubenswrapper[6932]: I0319 11:57:12.248046 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:57:12.248070 master-0 kubenswrapper[6932]: I0319 11:57:12.248072 6932 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:57:12.248179 master-0 kubenswrapper[6932]: I0319 11:57:12.248082 6932 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 11:57:12.248179 master-0 kubenswrapper[6932]: I0319 11:57:12.248094 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 11:57:12.248179 master-0 kubenswrapper[6932]: I0319 11:57:12.248104 6932 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:12.248179 master-0 kubenswrapper[6932]: I0319 11:57:12.248140 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm"
Mar 19 11:57:12.248179 master-0 kubenswrapper[6932]: I0319 11:57:12.248152 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"
Mar 19 11:57:12.248179 master-0 kubenswrapper[6932]: I0319 11:57:12.248164 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:12.248179 master-0 kubenswrapper[6932]: I0319 11:57:12.248176 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" event={"ID":"3c3b0d24-ce5e-49c3-a546-874356f75dc6","Type":"ContainerDied","Data":"a35a4f30770261f78e16c8cbde80e6ad1d01d59985d717446c5cf700c3ca0a3e"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248201 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248215 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" event={"ID":"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76","Type":"ContainerDied","Data":"b1921d5234eb4af4d7731c20be87a9595434841b33d272f8f2c3ade584fe4c62"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248228 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-j528w" event={"ID":"8438d015-106b-4aed-ae12-dda781ce51fc","Type":"ContainerDied","Data":"27aeacdf42166ebdfe7943145673659894eb1a05c94251adf45a06c9d05c04a8"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248241 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" event={"ID":"66f88242-8b0b-4790-bbb6-445c19b34ee7","Type":"ContainerDied","Data":"f48ebfe02dc1f93683f1d2eea873f5d0c2c3081e3483e2d09faebd411fa396ef"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248253 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" event={"ID":"f5d73fef-1414-4b29-97ea-42e1c0b1ef18","Type":"ContainerDied","Data":"a00e4976297d868e9d1a74ee69351e1ac6225f1b3fff400804a95076bf8deddd"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248264 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" event={"ID":"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d","Type":"ContainerDied","Data":"ffd01994498e412e963b01ac06f0e6ad28082a18471897dde077305cc7888366"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248276 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerDied","Data":"c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248289 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerStarted","Data":"dce9d46436a1eb563a3c64b6a41398261e837f5052ca426ad3863d8198f90a75"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248299 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" event={"ID":"9b61ea14-a7ea-49f3-9df4-5655765ddf7c","Type":"ContainerDied","Data":"a63fe33504bcc71f9b4e0c9d251065dc432b3176905c1514b755fad213c3ed25"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248312 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" event={"ID":"dbcbba74-ac53-4724-a217-4d9b85e7c1db","Type":"ContainerDied","Data":"b6e56f4e0942ab58cf693081930c0b921d6a49180ecc1e1f47356ba56a945538"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248323 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" event={"ID":"6611e325-6152-480c-9c2c-1b503e49ccd2","Type":"ContainerDied","Data":"5b56b51126590bf802dd88d10f125adb62528aa19311215ff5bc2461894ca90f"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248338 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248351 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"870e66ff-82ed-4c91-8197-dddcb78048c2","Type":"ContainerDied","Data":"42a335ff2e41047c0beba4d30a5bd16330153a9f1ce92821358c191efd6f3fc9"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248363 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" event={"ID":"f5d73fef-1414-4b29-97ea-42e1c0b1ef18","Type":"ContainerStarted","Data":"9a366d0a6601b2de748ff6433c4e58b33cd25c67994a3cc7b19deed31b193ce5"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248371 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" event={"ID":"3c3b0d24-ce5e-49c3-a546-874356f75dc6","Type":"ContainerStarted","Data":"05b8cd521fd3c020a5958732718cd5b93105a635a1a43569b2bb722ef0b36180"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248380 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-j528w" event={"ID":"8438d015-106b-4aed-ae12-dda781ce51fc","Type":"ContainerStarted","Data":"97eb1a465790bd720388085fc15badddd0717999fea7e03106e51d2d591513fd"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248390 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248400 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" event={"ID":"9b61ea14-a7ea-49f3-9df4-5655765ddf7c","Type":"ContainerStarted","Data":"6ed27e0d6c4ae09582d5a0f02eeacd245422eb62b0fb54b563d1fef122917d0c"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248409 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerDied","Data":"dce9d46436a1eb563a3c64b6a41398261e837f5052ca426ad3863d8198f90a75"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248419 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerStarted","Data":"6face1f4fa1bdc241551919e3ba7726fe31927b0d6796c2e5cb9454a7c5c0bf2"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248428 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" event={"ID":"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76","Type":"ContainerStarted","Data":"3d5a7dbfa3497c92e61b4edeb7e66bb18bf811b923a65cdd5130c751b03363dc"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248439 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" event={"ID":"6611e325-6152-480c-9c2c-1b503e49ccd2","Type":"ContainerStarted","Data":"fdba162fd51d8d9d6f5213fd9100e93a1c05f699bfe386d38fcf6148a74ac467"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248451 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"870e66ff-82ed-4c91-8197-dddcb78048c2","Type":"ContainerDied","Data":"e775e2237b3e13100c3e1ab188e2c83cffcee4d11a252841dd57b5b92e5e9841"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248464 6932 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e775e2237b3e13100c3e1ab188e2c83cffcee4d11a252841dd57b5b92e5e9841"
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248475 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" event={"ID":"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf","Type":"ContainerDied","Data":"fd8c32d22caf0bf1b1f569479b7d959cb1e7f7190abe63f16601f2e5b50a0711"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248489 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" event={"ID":"b3de8a1b-a5be-414f-86e8-738e16c8bc97","Type":"ContainerDied","Data":"ecca7c744f565812652616c950bf4c3ba074defb48c439f60ea10ec59b205e80"}
Mar 19 11:57:12.248461 master-0 kubenswrapper[6932]: I0319 11:57:12.248501 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" event={"ID":"163d6a3d-0080-4122-bb7a-17f6e63f00f0","Type":"ContainerDied","Data":"a5a674d7299c49bd88f1c56fca174966ef4c28920edc64023b6ce41812e041c8"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248514 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerDied","Data":"e21a965ed4cb2dd18edb22058723998ac546681c370497fc8735a2d87bc17971"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248527 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" event={"ID":"1b94d1eb-1b80-4a14-b1c0-d9e192231352","Type":"ContainerDied","Data":"8fcb298ecd66e79f2851c7b4502a7734938f56462fb5de52ed324ec2a3679f14"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248539 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" event={"ID":"376b18a9-5f33-44fd-a37b-20ab02c5e65d","Type":"ContainerDied","Data":"4c09f5575088b49e0ef7e52a5eb347dfd8470474e6a6ff5bf019908a8d6b87bc"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248549 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" event={"ID":"66f88242-8b0b-4790-bbb6-445c19b34ee7","Type":"ContainerStarted","Data":"aef3135a4740b3f0892de4712851aedbced779bb92df87ab75f715d784560839"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248558 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" event={"ID":"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf","Type":"ContainerStarted","Data":"d37fe7bef4927a9fd81b919a3fab62a0a3e02270254857eb323fe9d818374132"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248568 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" event={"ID":"dbcbba74-ac53-4724-a217-4d9b85e7c1db","Type":"ContainerStarted","Data":"9575c6811223a47118ebe1c3a86929b70364b064c96ab05397b49097821331c5"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248576 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" event={"ID":"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d","Type":"ContainerStarted","Data":"7219591da453ed42375208505406598d76510d7b310fc06ec8c884d91770e862"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248586 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" event={"ID":"daf4dbb6-5a0a-4c92-a930-479a7330ace1","Type":"ContainerDied","Data":"1961370e7c6f3b39c50205c4d3f694632a63f87701e4b9c1a6a05e005ec065b1"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248598 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248607 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248615 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248623 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248631 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b"}
Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248640 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj"
event={"ID":"76cf2b01-33d9-47eb-be5d-44946c78bf20","Type":"ContainerDied","Data":"87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5"} Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248651 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3"} Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248662 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" event={"ID":"163d6a3d-0080-4122-bb7a-17f6e63f00f0","Type":"ContainerStarted","Data":"ceffe432bb3380aafe0729954185b3652b99ca21e97ac6c1e688d47217f36148"} Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.248670 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" event={"ID":"376b18a9-5f33-44fd-a37b-20ab02c5e65d","Type":"ContainerStarted","Data":"2488db84b0849c81166877e395ec16ae06df9df840cc1e0200c1e2aef0f75b5f"} Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.249025 6932 scope.go:117] "RemoveContainer" containerID="87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5" Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.249462 6932 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"6face1f4fa1bdc241551919e3ba7726fe31927b0d6796c2e5cb9454a7c5c0bf2"} pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.249503 6932 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" containerID="cri-o://6face1f4fa1bdc241551919e3ba7726fe31927b0d6796c2e5cb9454a7c5c0bf2" gracePeriod=30 Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.249658 6932 scope.go:117] "RemoveContainer" containerID="c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6" Mar 19 11:57:12.250029 master-0 kubenswrapper[6932]: I0319 11:57:12.249856 6932 scope.go:117] "RemoveContainer" containerID="8fcb298ecd66e79f2851c7b4502a7734938f56462fb5de52ed324ec2a3679f14" Mar 19 11:57:12.251585 master-0 kubenswrapper[6932]: I0319 11:57:12.251533 6932 scope.go:117] "RemoveContainer" containerID="1961370e7c6f3b39c50205c4d3f694632a63f87701e4b9c1a6a05e005ec065b1" Mar 19 11:57:12.282959 master-0 kubenswrapper[6932]: I0319 11:57:12.278919 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": read tcp 10.128.0.2:58374->10.128.0.17:8443: read: connection reset by peer" start-of-body= Mar 19 11:57:12.282959 master-0 kubenswrapper[6932]: I0319 11:57:12.278999 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": read tcp 10.128.0.2:58374->10.128.0.17:8443: read: connection reset by peer" Mar 19 11:57:12.282959 master-0 kubenswrapper[6932]: I0319 11:57:12.279390 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 11:57:12.282959 master-0 kubenswrapper[6932]: I0319 11:57:12.279400 6932 patch_prober.go:28] interesting 
pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 19 11:57:12.282959 master-0 kubenswrapper[6932]: I0319 11:57:12.279413 6932 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="da8b94e0-a4d1-4071-adcd-56afcd1b0fbd" Mar 19 11:57:12.282959 master-0 kubenswrapper[6932]: I0319 11:57:12.279424 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 19 11:57:12.282959 master-0 kubenswrapper[6932]: I0319 11:57:12.282635 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 11:57:12.282959 master-0 kubenswrapper[6932]: I0319 11:57:12.282691 6932 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="da8b94e0-a4d1-4071-adcd-56afcd1b0fbd" Mar 19 11:57:12.285900 master-0 kubenswrapper[6932]: I0319 11:57:12.285826 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn57d"] Mar 19 11:57:12.286691 master-0 kubenswrapper[6932]: I0319 11:57:12.286607 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrvxk"] Mar 19 11:57:12.345766 master-0 kubenswrapper[6932]: I0319 11:57:12.345155 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" podStartSLOduration=177.686991069 
podStartE2EDuration="2m59.345136802s" podCreationTimestamp="2026-03-19 11:54:13 +0000 UTC" firstStartedPulling="2026-03-19 11:54:15.175079458 +0000 UTC m=+79.534139680" lastFinishedPulling="2026-03-19 11:54:16.833225191 +0000 UTC m=+81.192285413" observedRunningTime="2026-03-19 11:57:12.305919144 +0000 UTC m=+256.664979366" watchObservedRunningTime="2026-03-19 11:57:12.345136802 +0000 UTC m=+256.704197024" Mar 19 11:57:12.432998 master-0 kubenswrapper[6932]: I0319 11:57:12.432958 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 11:57:12.434002 master-0 kubenswrapper[6932]: I0319 11:57:12.433955 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 11:57:12.434134 master-0 kubenswrapper[6932]: I0319 11:57:12.434106 6932 scope.go:117] "RemoveContainer" containerID="294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c" Mar 19 11:57:12.468538 master-0 kubenswrapper[6932]: I0319 11:57:12.467925 6932 scope.go:117] "RemoveContainer" containerID="58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6" Mar 19 11:57:12.553018 master-0 kubenswrapper[6932]: I0319 11:57:12.550014 6932 scope.go:117] "RemoveContainer" containerID="c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6" Mar 19 11:57:12.553018 master-0 kubenswrapper[6932]: E0319 11:57:12.551138 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6\": container with ID starting with c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6 not found: ID does not exist" containerID="c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6" Mar 19 11:57:12.553018 master-0 kubenswrapper[6932]: I0319 11:57:12.551171 6932 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6"} err="failed to get container status \"c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6\": rpc error: code = NotFound desc = could not find container \"c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6\": container with ID starting with c2c2b96a5faf69402dfe85ec6b2718eb42ca2ecf78927fa96ef82a61fc3c2da6 not found: ID does not exist" Mar 19 11:57:12.553018 master-0 kubenswrapper[6932]: I0319 11:57:12.551194 6932 scope.go:117] "RemoveContainer" containerID="3ab6a68db657d0e7924cc47a81bc9831d8055a58f93210e34c6ef5c5b5597505" Mar 19 11:57:12.596206 master-0 kubenswrapper[6932]: I0319 11:57:12.595854 6932 scope.go:117] "RemoveContainer" containerID="294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c" Mar 19 11:57:12.596446 master-0 kubenswrapper[6932]: E0319 11:57:12.596348 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c\": container with ID starting with 294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c not found: ID does not exist" containerID="294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c" Mar 19 11:57:12.596446 master-0 kubenswrapper[6932]: I0319 11:57:12.596404 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c"} err="failed to get container status \"294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c\": rpc error: code = NotFound desc = could not find container \"294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c\": container with ID starting with 294d8a4101a65cda21ec3874afc3a3c7bd30756c657037c09db78beaa20e4b9c not found: ID does not exist" Mar 19 11:57:12.596446 master-0 kubenswrapper[6932]: I0319 
11:57:12.596433 6932 scope.go:117] "RemoveContainer" containerID="58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6" Mar 19 11:57:12.598408 master-0 kubenswrapper[6932]: E0319 11:57:12.598109 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6\": container with ID starting with 58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6 not found: ID does not exist" containerID="58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6" Mar 19 11:57:12.598408 master-0 kubenswrapper[6932]: I0319 11:57:12.598135 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6"} err="failed to get container status \"58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6\": rpc error: code = NotFound desc = could not find container \"58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6\": container with ID starting with 58cc59848776f9368dd32da99bd6c9b9284f95df012df470d98ae16fe81785f6 not found: ID does not exist" Mar 19 11:57:12.731893 master-0 kubenswrapper[6932]: I0319 11:57:12.731832 6932 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:57:12.814609 master-0 kubenswrapper[6932]: I0319 11:57:12.814553 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 19 11:57:12.946932 master-0 kubenswrapper[6932]: I0319 11:57:12.946868 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:57:12.950842 master-0 kubenswrapper[6932]: I0319 11:57:12.950803 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:57:13.172063 master-0 kubenswrapper[6932]: I0319 11:57:13.171905 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" event={"ID":"b3de8a1b-a5be-414f-86e8-738e16c8bc97","Type":"ContainerStarted","Data":"147ea002de1d61b828f3e4f59b89474a76a533a161c3a8b138665844ccc9c433"} Mar 19 11:57:13.172063 master-0 kubenswrapper[6932]: I0319 11:57:13.172052 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:57:13.174667 master-0 kubenswrapper[6932]: I0319 11:57:13.174639 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:57:13.176620 master-0 kubenswrapper[6932]: I0319 11:57:13.176566 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-6ghdm_e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/openshift-controller-manager-operator/1.log" Mar 19 11:57:13.179176 master-0 kubenswrapper[6932]: I0319 11:57:13.179129 6932 scope.go:117] "RemoveContainer" containerID="70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3" Mar 19 11:57:13.179470 master-0 kubenswrapper[6932]: E0319 11:57:13.179409 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(46f265536aba6292ead501bc9b49f327)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" Mar 19 11:57:13.181373 master-0 kubenswrapper[6932]: I0319 11:57:13.181344 6932 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-mjwfm_1b94d1eb-1b80-4a14-b1c0-d9e192231352/manager/0.log" Mar 19 11:57:13.181516 master-0 kubenswrapper[6932]: I0319 11:57:13.181477 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" event={"ID":"1b94d1eb-1b80-4a14-b1c0-d9e192231352","Type":"ContainerStarted","Data":"54f6f1a412b81f0f7c7a43eff29ebb6260a16932752b0d9e46f5d27af722be26"} Mar 19 11:57:13.181647 master-0 kubenswrapper[6932]: I0319 11:57:13.181618 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:57:13.183342 master-0 kubenswrapper[6932]: I0319 11:57:13.183314 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/0.log" Mar 19 11:57:13.183409 master-0 kubenswrapper[6932]: I0319 11:57:13.183373 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerStarted","Data":"853d3eee88157502b76c3c9b20b3de3f2808774e2eca0856840a19b4a56c5c18"} Mar 19 11:57:13.185416 master-0 kubenswrapper[6932]: I0319 11:57:13.185380 6932 generic.go:334] "Generic (PLEG): container finished" podID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerID="a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba" exitCode=0 Mar 19 11:57:13.185474 master-0 kubenswrapper[6932]: I0319 11:57:13.185465 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrvxk" 
event={"ID":"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9","Type":"ContainerDied","Data":"a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba"} Mar 19 11:57:13.185555 master-0 kubenswrapper[6932]: I0319 11:57:13.185529 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrvxk" event={"ID":"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9","Type":"ContainerStarted","Data":"c79535746841947fd388ae5680d8b49dbd8b1a4914cee13d44e9d1f304783117"} Mar 19 11:57:13.187924 master-0 kubenswrapper[6932]: I0319 11:57:13.187776 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-ng9ss_a3ceeece-bee9-4fcb-8517-95ebce38e223/openshift-config-operator/2.log" Mar 19 11:57:13.188643 master-0 kubenswrapper[6932]: I0319 11:57:13.188620 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-ng9ss_a3ceeece-bee9-4fcb-8517-95ebce38e223/openshift-config-operator/1.log" Mar 19 11:57:13.189205 master-0 kubenswrapper[6932]: I0319 11:57:13.189181 6932 generic.go:334] "Generic (PLEG): container finished" podID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerID="6face1f4fa1bdc241551919e3ba7726fe31927b0d6796c2e5cb9454a7c5c0bf2" exitCode=255 Mar 19 11:57:13.189309 master-0 kubenswrapper[6932]: I0319 11:57:13.189233 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerDied","Data":"6face1f4fa1bdc241551919e3ba7726fe31927b0d6796c2e5cb9454a7c5c0bf2"} Mar 19 11:57:13.189309 master-0 kubenswrapper[6932]: I0319 11:57:13.189257 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" 
event={"ID":"a3ceeece-bee9-4fcb-8517-95ebce38e223","Type":"ContainerStarted","Data":"458cf7cd156450c202f524633ff9f83e2511879ffdc708449094125b8667a8bc"} Mar 19 11:57:13.189309 master-0 kubenswrapper[6932]: I0319 11:57:13.189274 6932 scope.go:117] "RemoveContainer" containerID="dce9d46436a1eb563a3c64b6a41398261e837f5052ca426ad3863d8198f90a75" Mar 19 11:57:13.190081 master-0 kubenswrapper[6932]: I0319 11:57:13.189581 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:57:13.194270 master-0 kubenswrapper[6932]: I0319 11:57:13.194228 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" event={"ID":"daf4dbb6-5a0a-4c92-a930-479a7330ace1","Type":"ContainerStarted","Data":"e906b08d672ce284b02875a6853f0e751eac277b721f1ba103b6a1fce5dcd578"} Mar 19 11:57:13.204779 master-0 kubenswrapper[6932]: I0319 11:57:13.204722 6932 generic.go:334] "Generic (PLEG): container finished" podID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerID="2178ca0dec99a5992f2fd9c450e6e82583d472a7030f3cf5c76181f940b5ee3b" exitCode=0 Mar 19 11:57:13.204986 master-0 kubenswrapper[6932]: I0319 11:57:13.204964 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn57d" event={"ID":"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c","Type":"ContainerDied","Data":"2178ca0dec99a5992f2fd9c450e6e82583d472a7030f3cf5c76181f940b5ee3b"} Mar 19 11:57:13.205349 master-0 kubenswrapper[6932]: I0319 11:57:13.205066 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn57d" event={"ID":"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c","Type":"ContainerStarted","Data":"b65208e52222fdb3f7c61852c66c50026f3faa77384c9fee6973005796070926"} Mar 19 11:57:13.207882 master-0 kubenswrapper[6932]: I0319 11:57:13.207577 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" event={"ID":"76cf2b01-33d9-47eb-be5d-44946c78bf20","Type":"ContainerStarted","Data":"f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b"} Mar 19 11:57:13.208389 master-0 kubenswrapper[6932]: I0319 11:57:13.208355 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:57:13.213192 master-0 kubenswrapper[6932]: I0319 11:57:13.213123 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:57:13.878059 master-0 kubenswrapper[6932]: I0319 11:57:13.878002 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96498b3d-c93f-4b42-a0aa-2afec3450b1d" path="/var/lib/kubelet/pods/96498b3d-c93f-4b42-a0aa-2afec3450b1d/volumes" Mar 19 11:57:13.878529 master-0 kubenswrapper[6932]: I0319 11:57:13.878510 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c385dd73-4a25-4827-9c8f-d923afc782b7" path="/var/lib/kubelet/pods/c385dd73-4a25-4827-9c8f-d923afc782b7/volumes" Mar 19 11:57:14.221500 master-0 kubenswrapper[6932]: I0319 11:57:14.221446 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-ng9ss_a3ceeece-bee9-4fcb-8517-95ebce38e223/openshift-config-operator/2.log" Mar 19 11:57:17.815190 master-0 kubenswrapper[6932]: I0319 11:57:17.814430 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 19 11:57:17.837630 master-0 kubenswrapper[6932]: I0319 11:57:17.837576 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 19 11:57:18.543252 master-0 kubenswrapper[6932]: I0319 11:57:18.543172 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss 
container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 11:57:18.543513 master-0 kubenswrapper[6932]: I0319 11:57:18.543255 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:57:18.543513 master-0 kubenswrapper[6932]: I0319 11:57:18.543302 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 11:57:18.543513 master-0 kubenswrapper[6932]: I0319 11:57:18.543419 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:57:20.634036 master-0 kubenswrapper[6932]: I0319 11:57:20.633952 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:57:20.636344 master-0 kubenswrapper[6932]: I0319 11:57:20.636239 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:57:21.544424 master-0 kubenswrapper[6932]: I0319 11:57:21.543863 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 11:57:21.544424 master-0 kubenswrapper[6932]: I0319 11:57:21.543862 6932 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-ng9ss container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 11:57:21.544424 master-0 kubenswrapper[6932]: I0319 11:57:21.543949 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:57:21.544424 master-0 kubenswrapper[6932]: I0319 11:57:21.544003 6932 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" podUID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:57:22.730486 master-0 kubenswrapper[6932]: I0319 11:57:22.730429 6932 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:57:22.835171 master-0 kubenswrapper[6932]: I0319 11:57:22.835104 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 19 11:57:23.547476 master-0 kubenswrapper[6932]: I0319 11:57:23.547420 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:57:25.257004 master-0 kubenswrapper[6932]: E0319 11:57:25.256914 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 11:57:26.421760 master-0 kubenswrapper[6932]: I0319 11:57:26.420154 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrvxk" event={"ID":"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9","Type":"ContainerStarted","Data":"0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002"} Mar 19 11:57:26.422780 master-0 kubenswrapper[6932]: I0319 11:57:26.422588 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn57d" event={"ID":"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c","Type":"ContainerStarted","Data":"8aee032fde2826ca2d43f3e69dc5b88ceb4ae164c2d67f37aeea31d0257a0d16"} Mar 19 11:57:27.427606 master-0 kubenswrapper[6932]: I0319 11:57:27.427542 6932 generic.go:334] "Generic (PLEG): container finished" podID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerID="0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002" exitCode=0 Mar 19 11:57:27.428176 master-0 kubenswrapper[6932]: I0319 11:57:27.427623 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrvxk" 
event={"ID":"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9","Type":"ContainerDied","Data":"0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002"} Mar 19 11:57:27.429126 master-0 kubenswrapper[6932]: I0319 11:57:27.429091 6932 generic.go:334] "Generic (PLEG): container finished" podID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerID="8aee032fde2826ca2d43f3e69dc5b88ceb4ae164c2d67f37aeea31d0257a0d16" exitCode=0 Mar 19 11:57:27.429126 master-0 kubenswrapper[6932]: I0319 11:57:27.429119 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn57d" event={"ID":"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c","Type":"ContainerDied","Data":"8aee032fde2826ca2d43f3e69dc5b88ceb4ae164c2d67f37aeea31d0257a0d16"} Mar 19 11:57:27.917793 master-0 kubenswrapper[6932]: E0319 11:57:27.917712 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" is forbidden: the server was unable to return a response in the time allotted, but may still be processing the request (get limitranges)" pod="openshift-etcd/etcd-master-0" Mar 19 11:57:28.440272 master-0 kubenswrapper[6932]: I0319 11:57:28.440206 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrvxk" event={"ID":"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9","Type":"ContainerStarted","Data":"405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed"} Mar 19 11:57:28.443477 master-0 kubenswrapper[6932]: I0319 11:57:28.443423 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn57d" event={"ID":"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c","Type":"ContainerStarted","Data":"450b27691871726f5f4969eb74205be5dae4f706974b3e180e9633b26833df06"} Mar 19 11:57:28.461664 master-0 kubenswrapper[6932]: I0319 11:57:28.461538 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rrvxk" podStartSLOduration=180.768673637 
podStartE2EDuration="3m15.461522046s" podCreationTimestamp="2026-03-19 11:54:13 +0000 UTC" firstStartedPulling="2026-03-19 11:57:13.186553572 +0000 UTC m=+257.545613794" lastFinishedPulling="2026-03-19 11:57:27.879401961 +0000 UTC m=+272.238462203" observedRunningTime="2026-03-19 11:57:28.460343897 +0000 UTC m=+272.819404119" watchObservedRunningTime="2026-03-19 11:57:28.461522046 +0000 UTC m=+272.820582268" Mar 19 11:57:28.483426 master-0 kubenswrapper[6932]: I0319 11:57:28.483363 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pn57d" podStartSLOduration=179.585833281 podStartE2EDuration="3m14.483347516s" podCreationTimestamp="2026-03-19 11:54:14 +0000 UTC" firstStartedPulling="2026-03-19 11:57:13.214192647 +0000 UTC m=+257.573252869" lastFinishedPulling="2026-03-19 11:57:28.111706882 +0000 UTC m=+272.470767104" observedRunningTime="2026-03-19 11:57:28.481761358 +0000 UTC m=+272.840821590" watchObservedRunningTime="2026-03-19 11:57:28.483347516 +0000 UTC m=+272.842407738" Mar 19 11:57:28.871147 master-0 kubenswrapper[6932]: I0319 11:57:28.871021 6932 scope.go:117] "RemoveContainer" containerID="70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3" Mar 19 11:57:29.451089 master-0 kubenswrapper[6932]: I0319 11:57:29.450999 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"b533d36029413fb01ef12a682704ad486041204246c172de95f0a4aeff2f5180"} Mar 19 11:57:29.496523 master-0 kubenswrapper[6932]: I0319 11:57:29.496463 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 11:57:29.496783 master-0 kubenswrapper[6932]: E0319 11:57:29.496679 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870e66ff-82ed-4c91-8197-dddcb78048c2" containerName="installer" Mar 19 
11:57:29.496783 master-0 kubenswrapper[6932]: I0319 11:57:29.496695 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="870e66ff-82ed-4c91-8197-dddcb78048c2" containerName="installer" Mar 19 11:57:29.496783 master-0 kubenswrapper[6932]: E0319 11:57:29.496715 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bde080b-3820-463f-a27d-9fb9a7843d5d" containerName="installer" Mar 19 11:57:29.496783 master-0 kubenswrapper[6932]: I0319 11:57:29.496740 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bde080b-3820-463f-a27d-9fb9a7843d5d" containerName="installer" Mar 19 11:57:29.496783 master-0 kubenswrapper[6932]: E0319 11:57:29.496762 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" containerName="installer" Mar 19 11:57:29.496783 master-0 kubenswrapper[6932]: I0319 11:57:29.496770 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" containerName="installer" Mar 19 11:57:29.496783 master-0 kubenswrapper[6932]: E0319 11:57:29.496778 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c385dd73-4a25-4827-9c8f-d923afc782b7" containerName="installer" Mar 19 11:57:29.496783 master-0 kubenswrapper[6932]: I0319 11:57:29.496784 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="c385dd73-4a25-4827-9c8f-d923afc782b7" containerName="installer" Mar 19 11:57:29.497124 master-0 kubenswrapper[6932]: I0319 11:57:29.496874 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bde080b-3820-463f-a27d-9fb9a7843d5d" containerName="installer" Mar 19 11:57:29.497124 master-0 kubenswrapper[6932]: I0319 11:57:29.496886 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="870e66ff-82ed-4c91-8197-dddcb78048c2" containerName="installer" Mar 19 11:57:29.497124 master-0 kubenswrapper[6932]: I0319 11:57:29.496894 6932 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" containerName="installer" Mar 19 11:57:29.497124 master-0 kubenswrapper[6932]: I0319 11:57:29.496903 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="c385dd73-4a25-4827-9c8f-d923afc782b7" containerName="installer" Mar 19 11:57:29.497339 master-0 kubenswrapper[6932]: I0319 11:57:29.497307 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.501413 master-0 kubenswrapper[6932]: I0319 11:57:29.501317 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-kpv7f" Mar 19 11:57:29.501592 master-0 kubenswrapper[6932]: I0319 11:57:29.501340 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 11:57:29.518766 master-0 kubenswrapper[6932]: I0319 11:57:29.518691 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 11:57:29.649541 master-0 kubenswrapper[6932]: I0319 11:57:29.649474 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.649871 master-0 kubenswrapper[6932]: I0319 11:57:29.649855 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.650025 master-0 kubenswrapper[6932]: I0319 
11:57:29.650006 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.751515 master-0 kubenswrapper[6932]: I0319 11:57:29.751361 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.751901 master-0 kubenswrapper[6932]: I0319 11:57:29.751861 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.752026 master-0 kubenswrapper[6932]: I0319 11:57:29.751981 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.752130 master-0 kubenswrapper[6932]: I0319 11:57:29.752114 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " 
pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.752321 master-0 kubenswrapper[6932]: I0319 11:57:29.752145 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.772136 master-0 kubenswrapper[6932]: I0319 11:57:29.772054 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:29.816860 master-0 kubenswrapper[6932]: I0319 11:57:29.816783 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:57:30.213050 master-0 kubenswrapper[6932]: I0319 11:57:30.212988 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 11:57:30.218559 master-0 kubenswrapper[6932]: W0319 11:57:30.218508 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod76f8b2b8_4315_431b_a2b9_deab1bfc7884.slice/crio-95429c69af4c1bb1223ac6239bd6b9c564ae0325597ea2510ccb844f342bde54 WatchSource:0}: Error finding container 95429c69af4c1bb1223ac6239bd6b9c564ae0325597ea2510ccb844f342bde54: Status 404 returned error can't find the container with id 95429c69af4c1bb1223ac6239bd6b9c564ae0325597ea2510ccb844f342bde54 Mar 19 11:57:30.460963 master-0 kubenswrapper[6932]: I0319 11:57:30.460884 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" 
event={"ID":"76f8b2b8-4315-431b-a2b9-deab1bfc7884","Type":"ContainerStarted","Data":"95429c69af4c1bb1223ac6239bd6b9c564ae0325597ea2510ccb844f342bde54"} Mar 19 11:57:31.467902 master-0 kubenswrapper[6932]: I0319 11:57:31.467774 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"76f8b2b8-4315-431b-a2b9-deab1bfc7884","Type":"ContainerStarted","Data":"182613f47c988603fa253da695bf14c0d843684239f6f10a5d0b78872f67dc68"} Mar 19 11:57:31.570137 master-0 kubenswrapper[6932]: I0319 11:57:31.570050 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=2.570031486 podStartE2EDuration="2.570031486s" podCreationTimestamp="2026-03-19 11:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:31.566440849 +0000 UTC m=+275.925501121" watchObservedRunningTime="2026-03-19 11:57:31.570031486 +0000 UTC m=+275.929091708" Mar 19 11:57:32.478420 master-0 kubenswrapper[6932]: E0319 11:57:32.478234 6932 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:57:22Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:57:22Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:57:22Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:57:22Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c69833
95b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:908eaaf624959bc7645f6d585d160431d1efb070e9a1f37fefed73a3be42b0d3\\\"],\\\"sizeBytes\\\":470681292},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":468265024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\
"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:759fb1d5353dbbadd443f38631d977ca3aed9787b873be05cc9660532a252739\\\"],\\\"sizeBytes\\\":448828620},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483\\\"],\\\"sizeBytes\\\":438654374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3a494212f1ba17f0f0980eef583218330eccb56eadf6b8cb0548c76d99b5014\\\"],\\\"sizeBytes\\\":407347125},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422\\\"],\\\"sizeBytes\\\":396521761}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:57:34.273360 master-0 kubenswrapper[6932]: I0319 11:57:34.273274 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 
11:57:34.273360 master-0 kubenswrapper[6932]: I0319 11:57:34.273364 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:57:34.307966 master-0 kubenswrapper[6932]: I0319 11:57:34.307896 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:57:34.523619 master-0 kubenswrapper[6932]: I0319 11:57:34.523449 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:57:35.323833 master-0 kubenswrapper[6932]: I0319 11:57:35.323779 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:57:35.323833 master-0 kubenswrapper[6932]: I0319 11:57:35.323840 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:57:35.359828 master-0 kubenswrapper[6932]: I0319 11:57:35.359744 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:57:35.521038 master-0 kubenswrapper[6932]: I0319 11:57:35.520976 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pn57d" Mar 19 11:57:35.901348 master-0 kubenswrapper[6932]: I0319 11:57:35.901284 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn57d"] Mar 19 11:57:35.904320 master-0 kubenswrapper[6932]: I0319 11:57:35.904269 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rrvxk"] Mar 19 11:57:36.491700 master-0 kubenswrapper[6932]: I0319 11:57:36.491557 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rrvxk" 
podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerName="registry-server" containerID="cri-o://405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed" gracePeriod=2 Mar 19 11:57:36.887413 master-0 kubenswrapper[6932]: I0319 11:57:36.887307 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:57:37.047302 master-0 kubenswrapper[6932]: I0319 11:57:37.047219 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-utilities\") pod \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " Mar 19 11:57:37.047632 master-0 kubenswrapper[6932]: I0319 11:57:37.047417 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-catalog-content\") pod \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " Mar 19 11:57:37.047632 master-0 kubenswrapper[6932]: I0319 11:57:37.047479 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsdjb\" (UniqueName: \"kubernetes.io/projected/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-kube-api-access-lsdjb\") pod \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\" (UID: \"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9\") " Mar 19 11:57:37.048272 master-0 kubenswrapper[6932]: I0319 11:57:37.048203 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-utilities" (OuterVolumeSpecName: "utilities") pod "64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" (UID: "64f5cbf1-f761-4531-8e5c-1f9b318b0cb9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:57:37.051058 master-0 kubenswrapper[6932]: I0319 11:57:37.051000 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-kube-api-access-lsdjb" (OuterVolumeSpecName: "kube-api-access-lsdjb") pod "64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" (UID: "64f5cbf1-f761-4531-8e5c-1f9b318b0cb9"). InnerVolumeSpecName "kube-api-access-lsdjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:57:37.148241 master-0 kubenswrapper[6932]: I0319 11:57:37.148131 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsdjb\" (UniqueName: \"kubernetes.io/projected/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-kube-api-access-lsdjb\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:37.148241 master-0 kubenswrapper[6932]: I0319 11:57:37.148164 6932 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-utilities\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:37.497817 master-0 kubenswrapper[6932]: I0319 11:57:37.497711 6932 generic.go:334] "Generic (PLEG): container finished" podID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerID="405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed" exitCode=0 Mar 19 11:57:37.497817 master-0 kubenswrapper[6932]: I0319 11:57:37.497772 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrvxk" event={"ID":"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9","Type":"ContainerDied","Data":"405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed"} Mar 19 11:57:37.497817 master-0 kubenswrapper[6932]: I0319 11:57:37.497810 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrvxk" 
event={"ID":"64f5cbf1-f761-4531-8e5c-1f9b318b0cb9","Type":"ContainerDied","Data":"c79535746841947fd388ae5680d8b49dbd8b1a4914cee13d44e9d1f304783117"} Mar 19 11:57:37.497817 master-0 kubenswrapper[6932]: I0319 11:57:37.497823 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrvxk" Mar 19 11:57:37.498538 master-0 kubenswrapper[6932]: I0319 11:57:37.497830 6932 scope.go:117] "RemoveContainer" containerID="405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed" Mar 19 11:57:37.498538 master-0 kubenswrapper[6932]: I0319 11:57:37.497927 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pn57d" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerName="registry-server" containerID="cri-o://450b27691871726f5f4969eb74205be5dae4f706974b3e180e9633b26833df06" gracePeriod=2 Mar 19 11:57:37.511756 master-0 kubenswrapper[6932]: I0319 11:57:37.511709 6932 scope.go:117] "RemoveContainer" containerID="0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002" Mar 19 11:57:37.525886 master-0 kubenswrapper[6932]: I0319 11:57:37.525760 6932 scope.go:117] "RemoveContainer" containerID="a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba" Mar 19 11:57:37.537304 master-0 kubenswrapper[6932]: I0319 11:57:37.537258 6932 scope.go:117] "RemoveContainer" containerID="405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed" Mar 19 11:57:37.537906 master-0 kubenswrapper[6932]: E0319 11:57:37.537681 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed\": container with ID starting with 405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed not found: ID does not exist" containerID="405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed" Mar 19 
11:57:37.537906 master-0 kubenswrapper[6932]: I0319 11:57:37.537753 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed"} err="failed to get container status \"405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed\": rpc error: code = NotFound desc = could not find container \"405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed\": container with ID starting with 405681b4e087bf984381117ea7b1c3ba995ceb7dc4c6f465e5df580e2befefed not found: ID does not exist"
Mar 19 11:57:37.537906 master-0 kubenswrapper[6932]: I0319 11:57:37.537795 6932 scope.go:117] "RemoveContainer" containerID="0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002"
Mar 19 11:57:37.538159 master-0 kubenswrapper[6932]: E0319 11:57:37.538125 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002\": container with ID starting with 0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002 not found: ID does not exist" containerID="0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002"
Mar 19 11:57:37.538215 master-0 kubenswrapper[6932]: I0319 11:57:37.538159 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002"} err="failed to get container status \"0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002\": rpc error: code = NotFound desc = could not find container \"0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002\": container with ID starting with 0cf8f5f7d033dcecb519d5f330a3bdb1844539d062ac064cc4e8a74d131c9002 not found: ID does not exist"
Mar 19 11:57:37.538215 master-0 kubenswrapper[6932]: I0319 11:57:37.538175 6932 scope.go:117] "RemoveContainer" containerID="a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba"
Mar 19 11:57:37.538456 master-0 kubenswrapper[6932]: E0319 11:57:37.538438 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba\": container with ID starting with a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba not found: ID does not exist" containerID="a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba"
Mar 19 11:57:37.538518 master-0 kubenswrapper[6932]: I0319 11:57:37.538455 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba"} err="failed to get container status \"a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba\": rpc error: code = NotFound desc = could not find container \"a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba\": container with ID starting with a9e27f322bd00992bb4993c33f61aa9b49e3677fa4a2096a53f4ef8f27534fba not found: ID does not exist"
Mar 19 11:57:37.640921 master-0 kubenswrapper[6932]: I0319 11:57:37.640835 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" (UID: "64f5cbf1-f761-4531-8e5c-1f9b318b0cb9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 11:57:37.659974 master-0 kubenswrapper[6932]: I0319 11:57:37.659922 6932 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:38.151703 master-0 kubenswrapper[6932]: I0319 11:57:38.151494 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:38.187621 master-0 kubenswrapper[6932]: I0319 11:57:38.187500 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rrvxk"]
Mar 19 11:57:38.192636 master-0 kubenswrapper[6932]: I0319 11:57:38.192569 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rrvxk"]
Mar 19 11:57:38.201699 master-0 kubenswrapper[6932]: I0319 11:57:38.201631 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ccbc5"]
Mar 19 11:57:38.202051 master-0 kubenswrapper[6932]: E0319 11:57:38.201910 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerName="extract-content"
Mar 19 11:57:38.202051 master-0 kubenswrapper[6932]: I0319 11:57:38.201923 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerName="extract-content"
Mar 19 11:57:38.202051 master-0 kubenswrapper[6932]: E0319 11:57:38.201952 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerName="extract-utilities"
Mar 19 11:57:38.202051 master-0 kubenswrapper[6932]: I0319 11:57:38.201959 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerName="extract-utilities"
Mar 19 11:57:38.202051 master-0 kubenswrapper[6932]: E0319 11:57:38.201979 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerName="registry-server"
Mar 19 11:57:38.202051 master-0 kubenswrapper[6932]: I0319 11:57:38.201988 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerName="registry-server"
Mar 19 11:57:38.202287 master-0 kubenswrapper[6932]: I0319 11:57:38.202077 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" containerName="registry-server"
Mar 19 11:57:38.202831 master-0 kubenswrapper[6932]: I0319 11:57:38.202802 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.204537 master-0 kubenswrapper[6932]: I0319 11:57:38.204499 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h668l"]
Mar 19 11:57:38.205476 master-0 kubenswrapper[6932]: I0319 11:57:38.205448 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.205929 master-0 kubenswrapper[6932]: I0319 11:57:38.205902 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-6gp54"
Mar 19 11:57:38.206946 master-0 kubenswrapper[6932]: I0319 11:57:38.206912 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-6zbld"
Mar 19 11:57:38.207267 master-0 kubenswrapper[6932]: I0319 11:57:38.207222 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwt6h"]
Mar 19 11:57:38.208408 master-0 kubenswrapper[6932]: I0319 11:57:38.208359 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.220535 master-0 kubenswrapper[6932]: I0319 11:57:38.220478 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h668l"]
Mar 19 11:57:38.224673 master-0 kubenswrapper[6932]: I0319 11:57:38.224604 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ccbc5"]
Mar 19 11:57:38.227096 master-0 kubenswrapper[6932]: I0319 11:57:38.227050 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwt6h"]
Mar 19 11:57:38.366999 master-0 kubenswrapper[6932]: I0319 11:57:38.366918 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-catalog-content\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.366999 master-0 kubenswrapper[6932]: I0319 11:57:38.366987 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbq7n\" (UniqueName: \"kubernetes.io/projected/d1eef757-d63a-4708-8efe-7b27eea1ff63-kube-api-access-kbq7n\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.367319 master-0 kubenswrapper[6932]: I0319 11:57:38.367039 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-catalog-content\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.367319 master-0 kubenswrapper[6932]: I0319 11:57:38.367070 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-utilities\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.367319 master-0 kubenswrapper[6932]: I0319 11:57:38.367108 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-catalog-content\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.367319 master-0 kubenswrapper[6932]: I0319 11:57:38.367139 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-utilities\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.367319 master-0 kubenswrapper[6932]: I0319 11:57:38.367161 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-utilities\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.367319 master-0 kubenswrapper[6932]: I0319 11:57:38.367190 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9ddk\" (UniqueName: \"kubernetes.io/projected/00dd3703-af25-4e71-b20b-b3e153383489-kube-api-access-k9ddk\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.367319 master-0 kubenswrapper[6932]: I0319 11:57:38.367213 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-894bt\" (UniqueName: \"kubernetes.io/projected/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-kube-api-access-894bt\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.468115 master-0 kubenswrapper[6932]: I0319 11:57:38.468032 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-catalog-content\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468317 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbq7n\" (UniqueName: \"kubernetes.io/projected/d1eef757-d63a-4708-8efe-7b27eea1ff63-kube-api-access-kbq7n\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468357 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-catalog-content\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468383 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-utilities\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468839 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-catalog-content\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468862 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-utilities\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468881 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-utilities\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468858 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-catalog-content\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468924 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-utilities\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468948 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9ddk\" (UniqueName: \"kubernetes.io/projected/00dd3703-af25-4e71-b20b-b3e153383489-kube-api-access-k9ddk\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.468966 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-894bt\" (UniqueName: \"kubernetes.io/projected/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-kube-api-access-894bt\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.469226 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-utilities\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.469290 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-catalog-content\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.469875 master-0 kubenswrapper[6932]: I0319 11:57:38.469393 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-utilities\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.472985 master-0 kubenswrapper[6932]: I0319 11:57:38.472861 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-catalog-content\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.491212 master-0 kubenswrapper[6932]: I0319 11:57:38.491147 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-894bt\" (UniqueName: \"kubernetes.io/projected/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-kube-api-access-894bt\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.491775 master-0 kubenswrapper[6932]: I0319 11:57:38.491744 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9ddk\" (UniqueName: \"kubernetes.io/projected/00dd3703-af25-4e71-b20b-b3e153383489-kube-api-access-k9ddk\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.491916 master-0 kubenswrapper[6932]: I0319 11:57:38.491871 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbq7n\" (UniqueName: \"kubernetes.io/projected/d1eef757-d63a-4708-8efe-7b27eea1ff63-kube-api-access-kbq7n\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.511476 master-0 kubenswrapper[6932]: I0319 11:57:38.511306 6932 generic.go:334] "Generic (PLEG): container finished" podID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerID="450b27691871726f5f4969eb74205be5dae4f706974b3e180e9633b26833df06" exitCode=0
Mar 19 11:57:38.511476 master-0 kubenswrapper[6932]: I0319 11:57:38.511398 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn57d" event={"ID":"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c","Type":"ContainerDied","Data":"450b27691871726f5f4969eb74205be5dae4f706974b3e180e9633b26833df06"}
Mar 19 11:57:38.521319 master-0 kubenswrapper[6932]: I0319 11:57:38.521225 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:57:38.540835 master-0 kubenswrapper[6932]: I0319 11:57:38.540767 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:57:38.569993 master-0 kubenswrapper[6932]: I0319 11:57:38.569939 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:57:38.578281 master-0 kubenswrapper[6932]: I0319 11:57:38.577895 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn57d"
Mar 19 11:57:38.670059 master-0 kubenswrapper[6932]: I0319 11:57:38.669430 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w2fqh"]
Mar 19 11:57:38.670059 master-0 kubenswrapper[6932]: E0319 11:57:38.669671 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerName="extract-utilities"
Mar 19 11:57:38.670059 master-0 kubenswrapper[6932]: I0319 11:57:38.669710 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerName="extract-utilities"
Mar 19 11:57:38.670059 master-0 kubenswrapper[6932]: E0319 11:57:38.669753 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerName="registry-server"
Mar 19 11:57:38.670059 master-0 kubenswrapper[6932]: I0319 11:57:38.669762 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerName="registry-server"
Mar 19 11:57:38.670059 master-0 kubenswrapper[6932]: E0319 11:57:38.669777 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerName="extract-content"
Mar 19 11:57:38.670059 master-0 kubenswrapper[6932]: I0319 11:57:38.669785 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerName="extract-content"
Mar 19 11:57:38.670059 master-0 kubenswrapper[6932]: I0319 11:57:38.669908 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" containerName="registry-server"
Mar 19 11:57:38.670538 master-0 kubenswrapper[6932]: I0319 11:57:38.670340 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-catalog-content\") pod \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") "
Mar 19 11:57:38.670538 master-0 kubenswrapper[6932]: I0319 11:57:38.670382 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g45kt\" (UniqueName: \"kubernetes.io/projected/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-kube-api-access-g45kt\") pod \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") "
Mar 19 11:57:38.670538 master-0 kubenswrapper[6932]: I0319 11:57:38.670429 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-utilities\") pod \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\" (UID: \"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c\") "
Mar 19 11:57:38.670956 master-0 kubenswrapper[6932]: I0319 11:57:38.670858 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.671544 master-0 kubenswrapper[6932]: I0319 11:57:38.671275 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-utilities" (OuterVolumeSpecName: "utilities") pod "ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" (UID: "ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 11:57:38.674233 master-0 kubenswrapper[6932]: I0319 11:57:38.674199 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-cjb2h"
Mar 19 11:57:38.675345 master-0 kubenswrapper[6932]: I0319 11:57:38.675250 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-kube-api-access-g45kt" (OuterVolumeSpecName: "kube-api-access-g45kt") pod "ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" (UID: "ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c"). InnerVolumeSpecName "kube-api-access-g45kt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:57:38.709786 master-0 kubenswrapper[6932]: I0319 11:57:38.709703 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2fqh"]
Mar 19 11:57:38.735519 master-0 kubenswrapper[6932]: I0319 11:57:38.735396 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" (UID: "ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 11:57:38.771325 master-0 kubenswrapper[6932]: I0319 11:57:38.771230 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-catalog-content\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.771671 master-0 kubenswrapper[6932]: I0319 11:57:38.771339 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg6sp\" (UniqueName: \"kubernetes.io/projected/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-kube-api-access-hg6sp\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.771671 master-0 kubenswrapper[6932]: I0319 11:57:38.771385 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-utilities\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.771671 master-0 kubenswrapper[6932]: I0319 11:57:38.771498 6932 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:38.771671 master-0 kubenswrapper[6932]: I0319 11:57:38.771517 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g45kt\" (UniqueName: \"kubernetes.io/projected/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-kube-api-access-g45kt\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:38.771671 master-0 kubenswrapper[6932]: I0319 11:57:38.771531 6932 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c-utilities\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:38.852321 master-0 kubenswrapper[6932]: I0319 11:57:38.852253 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:38.856677 master-0 kubenswrapper[6932]: I0319 11:57:38.856632 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:38.872652 master-0 kubenswrapper[6932]: I0319 11:57:38.872591 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg6sp\" (UniqueName: \"kubernetes.io/projected/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-kube-api-access-hg6sp\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.872917 master-0 kubenswrapper[6932]: I0319 11:57:38.872673 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-utilities\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.872917 master-0 kubenswrapper[6932]: I0319 11:57:38.872771 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-catalog-content\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.873474 master-0 kubenswrapper[6932]: I0319 11:57:38.873417 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-utilities\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.873549 master-0 kubenswrapper[6932]: I0319 11:57:38.873442 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-catalog-content\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.898869 master-0 kubenswrapper[6932]: I0319 11:57:38.893627 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg6sp\" (UniqueName: \"kubernetes.io/projected/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-kube-api-access-hg6sp\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:38.920015 master-0 kubenswrapper[6932]: I0319 11:57:38.919183 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ccbc5"]
Mar 19 11:57:38.991506 master-0 kubenswrapper[6932]: I0319 11:57:38.991444 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h668l"]
Mar 19 11:57:38.998169 master-0 kubenswrapper[6932]: W0319 11:57:38.998094 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1eef757_d63a_4708_8efe_7b27eea1ff63.slice/crio-e8aeb063908bf0937ac94f698cd72366b310f38d0a1756120e33b67a92cd55de WatchSource:0}: Error finding container e8aeb063908bf0937ac94f698cd72366b310f38d0a1756120e33b67a92cd55de: Status 404 returned error can't find the container with id e8aeb063908bf0937ac94f698cd72366b310f38d0a1756120e33b67a92cd55de
Mar 19 11:57:38.998426 master-0 kubenswrapper[6932]: I0319 11:57:38.998344 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:57:39.046136 master-0 kubenswrapper[6932]: I0319 11:57:39.046082 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwt6h"]
Mar 19 11:57:39.059844 master-0 kubenswrapper[6932]: W0319 11:57:39.059769 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00dd3703_af25_4e71_b20b_b3e153383489.slice/crio-937722eb1a7c864cdcd30f00f097350a31b04584988bb543632ba097925b3bbe WatchSource:0}: Error finding container 937722eb1a7c864cdcd30f00f097350a31b04584988bb543632ba097925b3bbe: Status 404 returned error can't find the container with id 937722eb1a7c864cdcd30f00f097350a31b04584988bb543632ba097925b3bbe
Mar 19 11:57:39.438390 master-0 kubenswrapper[6932]: I0319 11:57:39.438341 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2fqh"]
Mar 19 11:57:39.444392 master-0 kubenswrapper[6932]: W0319 11:57:39.444361 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-ebe898baef0ea3cf7f17e803722db21b9281248ac2ac1d6fe40d8e59580a9cee WatchSource:0}: Error finding container ebe898baef0ea3cf7f17e803722db21b9281248ac2ac1d6fe40d8e59580a9cee: Status 404 returned error can't find the container with id ebe898baef0ea3cf7f17e803722db21b9281248ac2ac1d6fe40d8e59580a9cee
Mar 19 11:57:39.526886 master-0 kubenswrapper[6932]: I0319 11:57:39.526014 6932 generic.go:334] "Generic (PLEG): container finished" podID="d1eef757-d63a-4708-8efe-7b27eea1ff63" containerID="ec41ae21e96c81776f1b758c8ccd3dc8a175f2bcaaf918fe5635b9c5e4aaf22a" exitCode=0
Mar 19 11:57:39.526886 master-0 kubenswrapper[6932]: I0319 11:57:39.526167 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h668l" event={"ID":"d1eef757-d63a-4708-8efe-7b27eea1ff63","Type":"ContainerDied","Data":"ec41ae21e96c81776f1b758c8ccd3dc8a175f2bcaaf918fe5635b9c5e4aaf22a"}
Mar 19 11:57:39.526886 master-0 kubenswrapper[6932]: I0319 11:57:39.526221 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h668l" event={"ID":"d1eef757-d63a-4708-8efe-7b27eea1ff63","Type":"ContainerStarted","Data":"e8aeb063908bf0937ac94f698cd72366b310f38d0a1756120e33b67a92cd55de"}
Mar 19 11:57:39.529346 master-0 kubenswrapper[6932]: I0319 11:57:39.529301 6932 generic.go:334] "Generic (PLEG): container finished" podID="00dd3703-af25-4e71-b20b-b3e153383489" containerID="14faec67fbcbd3bef3715a945c5a2d9c7ecc242573ef637c229e56ff09166d0d" exitCode=0
Mar 19 11:57:39.529422 master-0 kubenswrapper[6932]: I0319 11:57:39.529388 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwt6h" event={"ID":"00dd3703-af25-4e71-b20b-b3e153383489","Type":"ContainerDied","Data":"14faec67fbcbd3bef3715a945c5a2d9c7ecc242573ef637c229e56ff09166d0d"}
Mar 19 11:57:39.529422 master-0 kubenswrapper[6932]: I0319 11:57:39.529421 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwt6h" event={"ID":"00dd3703-af25-4e71-b20b-b3e153383489","Type":"ContainerStarted","Data":"937722eb1a7c864cdcd30f00f097350a31b04584988bb543632ba097925b3bbe"}
Mar 19 11:57:39.534434 master-0 kubenswrapper[6932]: I0319 11:57:39.534400 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn57d"
Mar 19 11:57:39.534585 master-0 kubenswrapper[6932]: I0319 11:57:39.534531 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn57d" event={"ID":"ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c","Type":"ContainerDied","Data":"b65208e52222fdb3f7c61852c66c50026f3faa77384c9fee6973005796070926"}
Mar 19 11:57:39.534754 master-0 kubenswrapper[6932]: I0319 11:57:39.534717 6932 scope.go:117] "RemoveContainer" containerID="450b27691871726f5f4969eb74205be5dae4f706974b3e180e9633b26833df06"
Mar 19 11:57:39.540768 master-0 kubenswrapper[6932]: I0319 11:57:39.540592 6932 generic.go:334] "Generic (PLEG): container finished" podID="cf6aab0e-defc-4a4b-8a07-f5af8bf177c4" containerID="394e4d00faf263e34f605868a3854ebf366726976f687b2665ba581d5a0e6077" exitCode=0
Mar 19 11:57:39.540875 master-0 kubenswrapper[6932]: I0319 11:57:39.540742 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ccbc5" event={"ID":"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4","Type":"ContainerDied","Data":"394e4d00faf263e34f605868a3854ebf366726976f687b2665ba581d5a0e6077"}
Mar 19 11:57:39.540875 master-0 kubenswrapper[6932]: I0319 11:57:39.540815 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ccbc5" event={"ID":"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4","Type":"ContainerStarted","Data":"aa3b1b6a2b92eddacf5dacab6a0147cb12a4e498e9d143158aa50de12bd5c3b7"}
Mar 19 11:57:39.543073 master-0 kubenswrapper[6932]: I0319 11:57:39.542668 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2fqh" event={"ID":"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103","Type":"ContainerStarted","Data":"ebe898baef0ea3cf7f17e803722db21b9281248ac2ac1d6fe40d8e59580a9cee"}
Mar 19 11:57:39.546333 master-0 kubenswrapper[6932]: I0319 11:57:39.546233 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:39.560575 master-0 kubenswrapper[6932]: I0319 11:57:39.560535 6932 scope.go:117] "RemoveContainer" containerID="8aee032fde2826ca2d43f3e69dc5b88ceb4ae164c2d67f37aeea31d0257a0d16"
Mar 19 11:57:39.584072 master-0 kubenswrapper[6932]: I0319 11:57:39.584036 6932 scope.go:117] "RemoveContainer" containerID="2178ca0dec99a5992f2fd9c450e6e82583d472a7030f3cf5c76181f940b5ee3b"
Mar 19 11:57:39.609978 master-0 kubenswrapper[6932]: I0319 11:57:39.609895 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn57d"]
Mar 19 11:57:39.611877 master-0 kubenswrapper[6932]: I0319 11:57:39.611833 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pn57d"]
Mar 19 11:57:39.878964 master-0 kubenswrapper[6932]: I0319 11:57:39.878887 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f5cbf1-f761-4531-8e5c-1f9b318b0cb9" path="/var/lib/kubelet/pods/64f5cbf1-f761-4531-8e5c-1f9b318b0cb9/volumes"
Mar 19 11:57:39.879561 master-0 kubenswrapper[6932]: I0319 11:57:39.879529 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c" path="/var/lib/kubelet/pods/ad27ee0f-ea6b-46a9-a7fd-7e04e9a3d73c/volumes"
Mar 19 11:57:40.562259 master-0 kubenswrapper[6932]: I0319 11:57:40.561100 6932 generic.go:334] "Generic (PLEG): container finished" podID="cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103" containerID="ea1dfb029f622e969a6f7e1a0158a90b12898130ad19366edfc5b84bc32d2910" exitCode=0
Mar 19 11:57:40.562259 master-0 kubenswrapper[6932]: I0319 11:57:40.561210 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2fqh" event={"ID":"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103","Type":"ContainerDied","Data":"ea1dfb029f622e969a6f7e1a0158a90b12898130ad19366edfc5b84bc32d2910"}
Mar 19 11:57:41.569687 master-0
kubenswrapper[6932]: I0319 11:57:41.569631 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwt6h" event={"ID":"00dd3703-af25-4e71-b20b-b3e153383489","Type":"ContainerStarted","Data":"eb9b90eb220564280b7feca2d6bf46f46e6aa93fdc1901db274ac1aaeb65eea5"} Mar 19 11:57:42.577960 master-0 kubenswrapper[6932]: I0319 11:57:42.577911 6932 generic.go:334] "Generic (PLEG): container finished" podID="00dd3703-af25-4e71-b20b-b3e153383489" containerID="eb9b90eb220564280b7feca2d6bf46f46e6aa93fdc1901db274ac1aaeb65eea5" exitCode=0 Mar 19 11:57:42.578580 master-0 kubenswrapper[6932]: I0319 11:57:42.578019 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwt6h" event={"ID":"00dd3703-af25-4e71-b20b-b3e153383489","Type":"ContainerDied","Data":"eb9b90eb220564280b7feca2d6bf46f46e6aa93fdc1901db274ac1aaeb65eea5"} Mar 19 11:57:42.582677 master-0 kubenswrapper[6932]: I0319 11:57:42.582644 6932 generic.go:334] "Generic (PLEG): container finished" podID="d1eef757-d63a-4708-8efe-7b27eea1ff63" containerID="80fbd86a4553b76ec244e68f8d481026639e06b5c8ed869b8aace67f7ab378b7" exitCode=0 Mar 19 11:57:42.582770 master-0 kubenswrapper[6932]: I0319 11:57:42.582687 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h668l" event={"ID":"d1eef757-d63a-4708-8efe-7b27eea1ff63","Type":"ContainerDied","Data":"80fbd86a4553b76ec244e68f8d481026639e06b5c8ed869b8aace67f7ab378b7"} Mar 19 11:57:43.596295 master-0 kubenswrapper[6932]: I0319 11:57:43.596211 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h668l" event={"ID":"d1eef757-d63a-4708-8efe-7b27eea1ff63","Type":"ContainerStarted","Data":"52c46f6274e375280d2d4464acdde76b57bb36f6fd92e4a49368d7246378ad67"} Mar 19 11:57:43.633736 master-0 kubenswrapper[6932]: I0319 11:57:43.633627 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-h668l" podStartSLOduration=5.148321401 podStartE2EDuration="8.633602979s" podCreationTimestamp="2026-03-19 11:57:35 +0000 UTC" firstStartedPulling="2026-03-19 11:57:39.530850397 +0000 UTC m=+283.889910619" lastFinishedPulling="2026-03-19 11:57:43.016131975 +0000 UTC m=+287.375192197" observedRunningTime="2026-03-19 11:57:43.63199885 +0000 UTC m=+287.991059072" watchObservedRunningTime="2026-03-19 11:57:43.633602979 +0000 UTC m=+287.992663201" Mar 19 11:57:45.613697 master-0 kubenswrapper[6932]: I0319 11:57:45.613645 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwt6h" event={"ID":"00dd3703-af25-4e71-b20b-b3e153383489","Type":"ContainerStarted","Data":"3fece31375ad47278b524f795e206b6a2c7f8ed283363107337ece722c645840"} Mar 19 11:57:46.621145 master-0 kubenswrapper[6932]: I0319 11:57:46.621070 6932 generic.go:334] "Generic (PLEG): container finished" podID="cf6aab0e-defc-4a4b-8a07-f5af8bf177c4" containerID="0d211d045e5393cc8859f1878115708d215b4f1d3901204a93e27c41a822a537" exitCode=0 Mar 19 11:57:46.621685 master-0 kubenswrapper[6932]: I0319 11:57:46.621198 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ccbc5" event={"ID":"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4","Type":"ContainerDied","Data":"0d211d045e5393cc8859f1878115708d215b4f1d3901204a93e27c41a822a537"} Mar 19 11:57:46.879669 master-0 kubenswrapper[6932]: I0319 11:57:46.872694 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwt6h" podStartSLOduration=8.010278984 podStartE2EDuration="11.872677129s" podCreationTimestamp="2026-03-19 11:57:35 +0000 UTC" firstStartedPulling="2026-03-19 11:57:39.533366168 +0000 UTC m=+283.892426390" lastFinishedPulling="2026-03-19 11:57:43.395764313 +0000 UTC m=+287.754824535" observedRunningTime="2026-03-19 11:57:45.636649126 +0000 UTC m=+289.995709358" 
watchObservedRunningTime="2026-03-19 11:57:46.872677129 +0000 UTC m=+291.231737351" Mar 19 11:57:48.541851 master-0 kubenswrapper[6932]: I0319 11:57:48.541760 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h668l" Mar 19 11:57:48.542528 master-0 kubenswrapper[6932]: I0319 11:57:48.542501 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h668l" Mar 19 11:57:48.571068 master-0 kubenswrapper[6932]: I0319 11:57:48.570993 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:57:48.571391 master-0 kubenswrapper[6932]: I0319 11:57:48.571348 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:57:48.575923 master-0 kubenswrapper[6932]: I0319 11:57:48.575887 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h668l" Mar 19 11:57:48.610563 master-0 kubenswrapper[6932]: I0319 11:57:48.610493 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:57:48.688710 master-0 kubenswrapper[6932]: I0319 11:57:48.688653 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h668l" Mar 19 11:57:50.557830 master-0 kubenswrapper[6932]: I0319 11:57:50.557014 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc"] Mar 19 11:57:50.558475 master-0 kubenswrapper[6932]: I0319 11:57:50.558162 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.565748 master-0 kubenswrapper[6932]: I0319 11:57:50.565683 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 11:57:50.572577 master-0 kubenswrapper[6932]: I0319 11:57:50.566018 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 11:57:50.572577 master-0 kubenswrapper[6932]: I0319 11:57:50.566015 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 11:57:50.572577 master-0 kubenswrapper[6932]: I0319 11:57:50.566178 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 11:57:50.572577 master-0 kubenswrapper[6932]: I0319 11:57:50.566227 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-2l456" Mar 19 11:57:50.572577 master-0 kubenswrapper[6932]: I0319 11:57:50.566184 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 11:57:50.577269 master-0 kubenswrapper[6932]: I0319 11:57:50.576058 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd"] Mar 19 11:57:50.577269 master-0 kubenswrapper[6932]: I0319 11:57:50.576881 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:57:50.584208 master-0 kubenswrapper[6932]: I0319 11:57:50.580654 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 11:57:50.584208 master-0 kubenswrapper[6932]: I0319 11:57:50.580897 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 11:57:50.584208 master-0 kubenswrapper[6932]: I0319 11:57:50.581020 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-g72px" Mar 19 11:57:50.587747 master-0 kubenswrapper[6932]: I0319 11:57:50.585027 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 11:57:50.615714 master-0 kubenswrapper[6932]: I0319 11:57:50.610377 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd"] Mar 19 11:57:50.658924 master-0 kubenswrapper[6932]: I0319 11:57:50.655857 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t"] Mar 19 11:57:50.666284 master-0 kubenswrapper[6932]: I0319 11:57:50.665776 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.670978 master-0 kubenswrapper[6932]: I0319 11:57:50.669874 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv"] Mar 19 11:57:50.670978 master-0 kubenswrapper[6932]: I0319 11:57:50.670828 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.672263 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.672553 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-kn6lc" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.672795 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.672878 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.672948 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.672957 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.674542 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.674763 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.674923 6932 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-9hcb7" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.675044 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 11:57:50.685764 master-0 kubenswrapper[6932]: I0319 11:57:50.675065 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 11:57:50.698949 master-0 kubenswrapper[6932]: I0319 11:57:50.697344 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf"] Mar 19 11:57:50.698949 master-0 kubenswrapper[6932]: I0319 11:57:50.698434 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:57:50.698949 master-0 kubenswrapper[6932]: I0319 11:57:50.698901 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 11:57:50.700308 master-0 kubenswrapper[6932]: I0319 11:57:50.700272 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 19 11:57:50.700559 master-0 kubenswrapper[6932]: I0319 11:57:50.700483 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-899lw" Mar 19 11:57:50.721766 master-0 kubenswrapper[6932]: I0319 11:57:50.717062 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6"] Mar 19 11:57:50.721766 master-0 kubenswrapper[6932]: I0319 11:57:50.721420 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.730742 master-0 kubenswrapper[6932]: I0319 11:57:50.728784 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 19 11:57:50.730742 master-0 kubenswrapper[6932]: I0319 11:57:50.729039 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-c95p8" Mar 19 11:57:50.730742 master-0 kubenswrapper[6932]: I0319 11:57:50.729080 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 19 11:57:50.730742 master-0 kubenswrapper[6932]: I0319 11:57:50.729304 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 19 11:57:50.730742 master-0 kubenswrapper[6932]: I0319 11:57:50.729443 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 19 11:57:50.745164 master-0 kubenswrapper[6932]: I0319 11:57:50.744805 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv"] Mar 19 11:57:50.749925 master-0 kubenswrapper[6932]: I0319 11:57:50.746932 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf"] Mar 19 11:57:50.751525 master-0 kubenswrapper[6932]: I0319 11:57:50.750327 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6"] Mar 19 11:57:50.754883 master-0 kubenswrapper[6932]: I0319 11:57:50.752172 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:57:50.754883 master-0 kubenswrapper[6932]: I0319 11:57:50.752239 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f5c84321-1399-48d6-916f-38011af8fd94-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.754883 master-0 kubenswrapper[6932]: I0319 11:57:50.752272 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-config\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.754883 master-0 kubenswrapper[6932]: I0319 11:57:50.752390 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.754883 master-0 kubenswrapper[6932]: I0319 11:57:50.752430 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b4mx\" (UniqueName: \"kubernetes.io/projected/f5c84321-1399-48d6-916f-38011af8fd94-kube-api-access-2b4mx\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: 
\"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.754883 master-0 kubenswrapper[6932]: I0319 11:57:50.752479 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p55f\" (UniqueName: \"kubernetes.io/projected/bb22a965-9b36-40cd-993d-747a3978be8e-kube-api-access-5p55f\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:57:50.758451 master-0 kubenswrapper[6932]: I0319 11:57:50.758207 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls"] Mar 19 11:57:50.760390 master-0 kubenswrapper[6932]: I0319 11:57:50.760369 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:50.764607 master-0 kubenswrapper[6932]: I0319 11:57:50.763765 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-pdvk4" Mar 19 11:57:50.764607 master-0 kubenswrapper[6932]: I0319 11:57:50.764092 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 11:57:50.764607 master-0 kubenswrapper[6932]: I0319 11:57:50.764317 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 11:57:50.764607 master-0 kubenswrapper[6932]: I0319 11:57:50.764468 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 19 11:57:50.764887 master-0 kubenswrapper[6932]: I0319 11:57:50.764753 6932 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 19 11:57:50.773341 master-0 kubenswrapper[6932]: I0319 11:57:50.769491 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls"] Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.797830 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f"] Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.798885 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6"] Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.799591 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.800112 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.801563 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.802012 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-6qwsh" Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.802041 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.802278 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 11:57:50.804310 master-0 kubenswrapper[6932]: I0319 11:57:50.802421 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 11:57:50.806127 master-0 kubenswrapper[6932]: I0319 11:57:50.806030 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-75w76" Mar 19 11:57:50.806127 master-0 kubenswrapper[6932]: I0319 11:57:50.806061 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 11:57:50.835973 master-0 kubenswrapper[6932]: I0319 11:57:50.832366 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f"] Mar 19 11:57:50.838239 master-0 kubenswrapper[6932]: I0319 11:57:50.837718 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6"] Mar 19 11:57:50.843963 master-0 kubenswrapper[6932]: I0319 11:57:50.843903 6932 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-djfg8"] Mar 19 11:57:50.846392 master-0 kubenswrapper[6932]: I0319 11:57:50.845211 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:50.847331 master-0 kubenswrapper[6932]: I0319 11:57:50.847066 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-dfb6p" Mar 19 11:57:50.847331 master-0 kubenswrapper[6932]: I0319 11:57:50.847173 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 11:57:50.847550 master-0 kubenswrapper[6932]: I0319 11:57:50.847531 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 11:57:50.847806 master-0 kubenswrapper[6932]: I0319 11:57:50.847788 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 19 11:57:50.848034 master-0 kubenswrapper[6932]: I0319 11:57:50.847904 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 19 11:57:50.851858 master-0 kubenswrapper[6932]: I0319 11:57:50.851830 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 19 11:57:50.853842 master-0 kubenswrapper[6932]: I0319 11:57:50.853421 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.855112 master-0 
kubenswrapper[6932]: I0319 11:57:50.854790 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.855112 master-0 kubenswrapper[6932]: I0319 11:57:50.854869 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx487\" (UniqueName: \"kubernetes.io/projected/b5c7eb66-e23e-40df-883c-fed012c07f26-kube-api-access-tx487\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.855112 master-0 kubenswrapper[6932]: I0319 11:57:50.854934 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tb9p\" (UniqueName: \"kubernetes.io/projected/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-kube-api-access-5tb9p\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.855112 master-0 kubenswrapper[6932]: I0319 11:57:50.854969 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.855112 master-0 kubenswrapper[6932]: I0319 11:57:50.855057 6932 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:57:50.855112 master-0 kubenswrapper[6932]: I0319 11:57:50.855101 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.855330 master-0 kubenswrapper[6932]: I0319 11:57:50.855127 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f5c84321-1399-48d6-916f-38011af8fd94-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.855406 master-0 kubenswrapper[6932]: I0319 11:57:50.855150 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.855972 master-0 kubenswrapper[6932]: I0319 11:57:50.855952 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-config\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.856092 master-0 kubenswrapper[6932]: I0319 11:57:50.856072 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.856185 master-0 kubenswrapper[6932]: I0319 11:57:50.856171 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.856369 master-0 kubenswrapper[6932]: I0319 11:57:50.856343 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856443 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-auth-proxy-config\") pod 
\"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856490 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856526 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856556 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b4mx\" (UniqueName: \"kubernetes.io/projected/f5c84321-1399-48d6-916f-38011af8fd94-kube-api-access-2b4mx\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856583 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dbmq\" (UniqueName: \"kubernetes.io/projected/52bdf7cc-f07d-487e-937c-6567f194947e-kube-api-access-8dbmq\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " 
pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856610 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7nhq\" (UniqueName: \"kubernetes.io/projected/92e401a4-ed2f-46f7-924b-329d7b313e6a-kube-api-access-c7nhq\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856634 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856666 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.856856 master-0 kubenswrapper[6932]: I0319 11:57:50.856717 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p55f\" (UniqueName: \"kubernetes.io/projected/bb22a965-9b36-40cd-993d-747a3978be8e-kube-api-access-5p55f\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 
11:57:50.858816 master-0 kubenswrapper[6932]: I0319 11:57:50.858241 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.858816 master-0 kubenswrapper[6932]: I0319 11:57:50.858561 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-config\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.866510 master-0 kubenswrapper[6932]: I0319 11:57:50.860123 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-djfg8"] Mar 19 11:57:50.866510 master-0 kubenswrapper[6932]: I0319 11:57:50.861008 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f5c84321-1399-48d6-916f-38011af8fd94-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.866510 master-0 kubenswrapper[6932]: I0319 11:57:50.862207 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:57:50.882300 master-0 kubenswrapper[6932]: I0319 11:57:50.882246 6932 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b4mx\" (UniqueName: \"kubernetes.io/projected/f5c84321-1399-48d6-916f-38011af8fd94-kube-api-access-2b4mx\") pod \"machine-approver-6cb57bb5db-9fbqc\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.885957 master-0 kubenswrapper[6932]: I0319 11:57:50.885909 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq"] Mar 19 11:57:50.886773 master-0 kubenswrapper[6932]: I0319 11:57:50.886745 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:50.888373 master-0 kubenswrapper[6932]: I0319 11:57:50.888340 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-8h56m" Mar 19 11:57:50.888449 master-0 kubenswrapper[6932]: I0319 11:57:50.888370 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 11:57:50.889018 master-0 kubenswrapper[6932]: I0319 11:57:50.888977 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq"] Mar 19 11:57:50.889585 master-0 kubenswrapper[6932]: I0319 11:57:50.889555 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:57:50.894451 master-0 kubenswrapper[6932]: I0319 11:57:50.894433 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p55f\" (UniqueName: \"kubernetes.io/projected/bb22a965-9b36-40cd-993d-747a3978be8e-kube-api-access-5p55f\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:57:50.930578 master-0 kubenswrapper[6932]: I0319 11:57:50.930521 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:57:50.957715 master-0 kubenswrapper[6932]: I0319 11:57:50.957561 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.957715 master-0 kubenswrapper[6932]: I0319 11:57:50.957619 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.957869 master-0 kubenswrapper[6932]: I0319 11:57:50.957798 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls\") pod 
\"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.957914 master-0 kubenswrapper[6932]: I0319 11:57:50.957860 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx487\" (UniqueName: \"kubernetes.io/projected/b5c7eb66-e23e-40df-883c-fed012c07f26-kube-api-access-tx487\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.957914 master-0 kubenswrapper[6932]: I0319 11:57:50.957900 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/034cad93-a500-4c58-8d97-fa49866a0d5e-snapshots\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:50.957998 master-0 kubenswrapper[6932]: I0319 11:57:50.957933 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:50.957998 master-0 kubenswrapper[6932]: I0319 11:57:50.957965 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbhv4\" (UniqueName: \"kubernetes.io/projected/ac09dba7-398c-4b0a-a415-edb73cb4cf30-kube-api-access-pbhv4\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 
11:57:50.958089 master-0 kubenswrapper[6932]: I0319 11:57:50.958003 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tb9p\" (UniqueName: \"kubernetes.io/projected/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-kube-api-access-5tb9p\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.958089 master-0 kubenswrapper[6932]: I0319 11:57:50.958036 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:50.958089 master-0 kubenswrapper[6932]: I0319 11:57:50.958074 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.958217 master-0 kubenswrapper[6932]: I0319 11:57:50.958103 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.958217 master-0 kubenswrapper[6932]: I0319 11:57:50.958147 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.958217 master-0 kubenswrapper[6932]: I0319 11:57:50.958186 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.958217 master-0 kubenswrapper[6932]: I0319 11:57:50.958214 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:50.958391 master-0 kubenswrapper[6932]: I0319 11:57:50.958242 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.958391 master-0 kubenswrapper[6932]: I0319 11:57:50.958279 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:50.958391 master-0 kubenswrapper[6932]: I0319 11:57:50.958327 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jptl6\" (UniqueName: \"kubernetes.io/projected/034cad93-a500-4c58-8d97-fa49866a0d5e-kube-api-access-jptl6\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:50.958391 master-0 kubenswrapper[6932]: I0319 11:57:50.958359 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:50.958564 master-0 kubenswrapper[6932]: I0319 11:57:50.958410 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.958564 master-0 kubenswrapper[6932]: I0319 11:57:50.958461 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.958905 master-0 kubenswrapper[6932]: I0319 11:57:50.958872 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:57:50.959147 master-0 kubenswrapper[6932]: I0319 11:57:50.959113 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:50.959215 master-0 kubenswrapper[6932]: I0319 11:57:50.959177 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:50.959362 master-0 kubenswrapper[6932]: I0319 11:57:50.959312 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6rv\" (UniqueName: \"kubernetes.io/projected/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-kube-api-access-dd6rv\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:50.959415 master-0 
kubenswrapper[6932]: I0319 11:57:50.959364 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9bx\" (UniqueName: \"kubernetes.io/projected/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-kube-api-access-8v9bx\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:50.959558 master-0 kubenswrapper[6932]: I0319 11:57:50.959514 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:50.959653 master-0 kubenswrapper[6932]: I0319 11:57:50.959619 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.959713 master-0 kubenswrapper[6932]: I0319 11:57:50.959664 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:50.959713 master-0 kubenswrapper[6932]: I0319 11:57:50.959701 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.959829 master-0 kubenswrapper[6932]: I0319 11:57:50.959720 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:50.959829 master-0 kubenswrapper[6932]: I0319 11:57:50.959782 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dbmq\" (UniqueName: \"kubernetes.io/projected/52bdf7cc-f07d-487e-937c-6567f194947e-kube-api-access-8dbmq\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:57:50.959829 master-0 kubenswrapper[6932]: I0319 11:57:50.959795 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.959829 master-0 kubenswrapper[6932]: I0319 11:57:50.959830 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7nhq\" (UniqueName: 
\"kubernetes.io/projected/92e401a4-ed2f-46f7-924b-329d7b313e6a-kube-api-access-c7nhq\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.959987 master-0 kubenswrapper[6932]: I0319 11:57:50.959858 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.959987 master-0 kubenswrapper[6932]: I0319 11:57:50.959894 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.960205 master-0 kubenswrapper[6932]: I0319 11:57:50.960175 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.960350 master-0 kubenswrapper[6932]: I0319 11:57:50.960335 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " 
pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.960570 master-0 kubenswrapper[6932]: I0319 11:57:50.960543 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.962198 master-0 kubenswrapper[6932]: I0319 11:57:50.962173 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:50.962327 master-0 kubenswrapper[6932]: I0319 11:57:50.962285 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:50.963226 master-0 kubenswrapper[6932]: I0319 11:57:50.963174 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.963667 master-0 kubenswrapper[6932]: I0319 
11:57:50.963639 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:50.967850 master-0 kubenswrapper[6932]: I0319 11:57:50.967815 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:57:51.020227 master-0 kubenswrapper[6932]: I0319 11:57:51.016637 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tb9p\" (UniqueName: \"kubernetes.io/projected/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-kube-api-access-5tb9p\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-wr65t\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:51.020227 master-0 kubenswrapper[6932]: I0319 11:57:51.016745 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7nhq\" (UniqueName: \"kubernetes.io/projected/92e401a4-ed2f-46f7-924b-329d7b313e6a-kube-api-access-c7nhq\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:51.031751 master-0 kubenswrapper[6932]: I0319 11:57:51.021162 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tx487\" (UniqueName: \"kubernetes.io/projected/b5c7eb66-e23e-40df-883c-fed012c07f26-kube-api-access-tx487\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:51.031751 master-0 kubenswrapper[6932]: I0319 11:57:51.022180 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dbmq\" (UniqueName: \"kubernetes.io/projected/52bdf7cc-f07d-487e-937c-6567f194947e-kube-api-access-8dbmq\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:57:51.031751 master-0 kubenswrapper[6932]: I0319 11:57:51.022314 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:57:51.051753 master-0 kubenswrapper[6932]: I0319 11:57:51.046060 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:57:51.061751 master-0 kubenswrapper[6932]: I0319 11:57:51.057049 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063278 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbhv4\" (UniqueName: \"kubernetes.io/projected/ac09dba7-398c-4b0a-a415-edb73cb4cf30-kube-api-access-pbhv4\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063319 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063356 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063383 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063402 6932 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7bq7\" (UniqueName: \"kubernetes.io/projected/6d41245b-33d4-40f8-bbe1-6d2247e2e335-kube-api-access-k7bq7\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063421 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063444 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jptl6\" (UniqueName: \"kubernetes.io/projected/034cad93-a500-4c58-8d97-fa49866a0d5e-kube-api-access-jptl6\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063463 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063483 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert\") pod 
\"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063501 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063516 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d41245b-33d4-40f8-bbe1-6d2247e2e335-tmpfs\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063538 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9bx\" (UniqueName: \"kubernetes.io/projected/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-kube-api-access-8v9bx\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063560 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6rv\" (UniqueName: \"kubernetes.io/projected/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-kube-api-access-dd6rv\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: 
I0319 11:57:51.063579 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063599 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063620 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063647 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063672 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/034cad93-a500-4c58-8d97-fa49866a0d5e-snapshots\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: 
\"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.063688 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:51.068749 master-0 kubenswrapper[6932]: I0319 11:57:51.068005 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.069567 master-0 kubenswrapper[6932]: I0319 11:57:51.069005 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.072810 master-0 kubenswrapper[6932]: I0319 11:57:51.070241 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/034cad93-a500-4c58-8d97-fa49866a0d5e-snapshots\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.072810 master-0 kubenswrapper[6932]: I0319 11:57:51.071345 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.072810 master-0 kubenswrapper[6932]: I0319 11:57:51.071495 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:51.072810 master-0 kubenswrapper[6932]: I0319 11:57:51.072573 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:51.082749 master-0 kubenswrapper[6932]: I0319 11:57:51.073881 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:51.082749 master-0 kubenswrapper[6932]: I0319 11:57:51.073951 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " 
pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.082749 master-0 kubenswrapper[6932]: I0319 11:57:51.074093 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.082749 master-0 kubenswrapper[6932]: I0319 11:57:51.075541 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:51.082749 master-0 kubenswrapper[6932]: I0319 11:57:51.082585 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.104750 master-0 kubenswrapper[6932]: I0319 11:57:51.099630 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptl6\" (UniqueName: \"kubernetes.io/projected/034cad93-a500-4c58-8d97-fa49866a0d5e-kube-api-access-jptl6\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.104750 master-0 kubenswrapper[6932]: I0319 11:57:51.104457 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6rv\" (UniqueName: 
\"kubernetes.io/projected/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-kube-api-access-dd6rv\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.106824 master-0 kubenswrapper[6932]: I0319 11:57:51.106447 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbhv4\" (UniqueName: \"kubernetes.io/projected/ac09dba7-398c-4b0a-a415-edb73cb4cf30-kube-api-access-pbhv4\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:51.114979 master-0 kubenswrapper[6932]: I0319 11:57:51.114918 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9bx\" (UniqueName: \"kubernetes.io/projected/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-kube-api-access-8v9bx\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:51.153553 master-0 kubenswrapper[6932]: I0319 11:57:51.153472 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:57:51.164920 master-0 kubenswrapper[6932]: I0319 11:57:51.164892 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.165030 master-0 kubenswrapper[6932]: I0319 11:57:51.164952 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.165453 master-0 kubenswrapper[6932]: I0319 11:57:51.165175 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7bq7\" (UniqueName: \"kubernetes.io/projected/6d41245b-33d4-40f8-bbe1-6d2247e2e335-kube-api-access-k7bq7\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.165453 master-0 kubenswrapper[6932]: I0319 11:57:51.165331 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d41245b-33d4-40f8-bbe1-6d2247e2e335-tmpfs\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.166037 master-0 kubenswrapper[6932]: I0319 11:57:51.165971 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/6d41245b-33d4-40f8-bbe1-6d2247e2e335-tmpfs\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.168939 master-0 kubenswrapper[6932]: I0319 11:57:51.168909 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.169020 master-0 kubenswrapper[6932]: I0319 11:57:51.168949 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.180680 master-0 kubenswrapper[6932]: I0319 11:57:51.180475 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7bq7\" (UniqueName: \"kubernetes.io/projected/6d41245b-33d4-40f8-bbe1-6d2247e2e335-kube-api-access-k7bq7\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.255592 master-0 kubenswrapper[6932]: I0319 11:57:51.255513 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:57:51.293843 master-0 kubenswrapper[6932]: I0319 11:57:51.293751 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:57:51.299330 master-0 kubenswrapper[6932]: I0319 11:57:51.298858 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:57:51.371354 master-0 kubenswrapper[6932]: I0319 11:57:51.371215 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:51.394481 master-0 kubenswrapper[6932]: I0319 11:57:51.394429 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:57:53.972980 master-0 kubenswrapper[6932]: I0319 11:57:53.972904 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc"] Mar 19 11:57:54.487813 master-0 kubenswrapper[6932]: W0319 11:57:54.486668 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c84321_1399_48d6_916f_38011af8fd94.slice/crio-41fd122de6629955b52f32ce925718d564553bd270f16d2cdb7c67de534171c6 WatchSource:0}: Error finding container 41fd122de6629955b52f32ce925718d564553bd270f16d2cdb7c67de534171c6: Status 404 returned error can't find the container with id 41fd122de6629955b52f32ce925718d564553bd270f16d2cdb7c67de534171c6 Mar 19 11:57:54.700382 master-0 kubenswrapper[6932]: I0319 11:57:54.689465 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" event={"ID":"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c","Type":"ContainerStarted","Data":"87604a2e751d08b3e86775f42a82bbd7312bc40226d08e5db1020a8212eac309"} Mar 19 11:57:54.700382 master-0 
kubenswrapper[6932]: I0319 11:57:54.693014 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ccbc5" event={"ID":"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4","Type":"ContainerStarted","Data":"f9b67754bf4b7eedd2a3e8cb27b7c47f2487f84c5c7330fdfd1358483f459599"} Mar 19 11:57:54.701708 master-0 kubenswrapper[6932]: I0319 11:57:54.700743 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" event={"ID":"f5c84321-1399-48d6-916f-38011af8fd94","Type":"ContainerStarted","Data":"41fd122de6629955b52f32ce925718d564553bd270f16d2cdb7c67de534171c6"} Mar 19 11:57:54.743236 master-0 kubenswrapper[6932]: I0319 11:57:54.736069 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ccbc5" podStartSLOduration=1.965614554 podStartE2EDuration="16.736042695s" podCreationTimestamp="2026-03-19 11:57:38 +0000 UTC" firstStartedPulling="2026-03-19 11:57:39.545532054 +0000 UTC m=+283.904592276" lastFinishedPulling="2026-03-19 11:57:54.315960195 +0000 UTC m=+298.675020417" observedRunningTime="2026-03-19 11:57:54.735154643 +0000 UTC m=+299.094214865" watchObservedRunningTime="2026-03-19 11:57:54.736042695 +0000 UTC m=+299.095102917" Mar 19 11:57:54.837035 master-0 kubenswrapper[6932]: I0319 11:57:54.836998 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6"] Mar 19 11:57:54.847602 master-0 kubenswrapper[6932]: W0319 11:57:54.847570 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e401a4_ed2f_46f7_924b_329d7b313e6a.slice/crio-18f4fa41b32bbdc0315d2c159f68c3407a8234dacc09fe18dea04525d0e88d8c WatchSource:0}: Error finding container 18f4fa41b32bbdc0315d2c159f68c3407a8234dacc09fe18dea04525d0e88d8c: Status 404 returned error can't find the container with id 
18f4fa41b32bbdc0315d2c159f68c3407a8234dacc09fe18dea04525d0e88d8c Mar 19 11:57:55.037760 master-0 kubenswrapper[6932]: I0319 11:57:55.036841 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf"] Mar 19 11:57:55.053935 master-0 kubenswrapper[6932]: I0319 11:57:55.053330 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6"] Mar 19 11:57:55.061989 master-0 kubenswrapper[6932]: I0319 11:57:55.060378 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq"] Mar 19 11:57:55.256441 master-0 kubenswrapper[6932]: I0319 11:57:55.252965 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls"] Mar 19 11:57:55.264993 master-0 kubenswrapper[6932]: W0319 11:57:55.264926 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cbbe8d0_aafb_499f_a1f4_affcea62c1ab.slice/crio-8070d874c8e6aab4717f63db58f142956c8f18d4e16e21f12ce84898692af2f8 WatchSource:0}: Error finding container 8070d874c8e6aab4717f63db58f142956c8f18d4e16e21f12ce84898692af2f8: Status 404 returned error can't find the container with id 8070d874c8e6aab4717f63db58f142956c8f18d4e16e21f12ce84898692af2f8 Mar 19 11:57:55.319078 master-0 kubenswrapper[6932]: I0319 11:57:55.319008 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-djfg8"] Mar 19 11:57:55.336972 master-0 kubenswrapper[6932]: I0319 11:57:55.336857 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv"] Mar 19 11:57:55.341703 master-0 kubenswrapper[6932]: I0319 11:57:55.341655 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f"] Mar 19 11:57:55.341845 master-0 kubenswrapper[6932]: I0319 11:57:55.341718 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd"] Mar 19 11:57:55.356218 master-0 kubenswrapper[6932]: W0319 11:57:55.356167 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac09dba7_398c_4b0a_a415_edb73cb4cf30.slice/crio-94301b494e7fa86c5ac2e6fa986da464195a196e9774c438e5a44b6eb0b525ae WatchSource:0}: Error finding container 94301b494e7fa86c5ac2e6fa986da464195a196e9774c438e5a44b6eb0b525ae: Status 404 returned error can't find the container with id 94301b494e7fa86c5ac2e6fa986da464195a196e9774c438e5a44b6eb0b525ae Mar 19 11:57:55.745639 master-0 kubenswrapper[6932]: I0319 11:57:55.744891 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" event={"ID":"75aedbcd-f6ed-43a1-941b-4b04887ffe8e","Type":"ContainerStarted","Data":"a6232cd9aea1b6981eba3233f5112b2daf3e1f9aa8ba3cf2455d53d45f4aa1c5"} Mar 19 11:57:55.745639 master-0 kubenswrapper[6932]: I0319 11:57:55.744948 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" event={"ID":"75aedbcd-f6ed-43a1-941b-4b04887ffe8e","Type":"ContainerStarted","Data":"492d91fc21d30f80345040a63ee30545a1658028ca8d297dee64246b255c0fcb"} Mar 19 11:57:55.753399 master-0 kubenswrapper[6932]: I0319 11:57:55.751968 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" event={"ID":"52bdf7cc-f07d-487e-937c-6567f194947e","Type":"ContainerStarted","Data":"683aa0635e184216531580a563438a1b652c9e9d46d69283fd6cdf0548cf223d"} Mar 19 11:57:55.756409 master-0 kubenswrapper[6932]: I0319 11:57:55.756315 6932 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2fqh" event={"ID":"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103","Type":"ContainerStarted","Data":"1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1"} Mar 19 11:57:55.783155 master-0 kubenswrapper[6932]: I0319 11:57:55.783057 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" event={"ID":"6d41245b-33d4-40f8-bbe1-6d2247e2e335","Type":"ContainerStarted","Data":"147bfaa1e4f28a3d90d383fb29684d03c69e0864161e9ad4e0054fce11cc2690"} Mar 19 11:57:55.783155 master-0 kubenswrapper[6932]: I0319 11:57:55.783125 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" event={"ID":"6d41245b-33d4-40f8-bbe1-6d2247e2e335","Type":"ContainerStarted","Data":"e4a5278beb2dd7685cc80a2eb75df7f2fe99740c2893e28197254b1cb14f8f97"} Mar 19 11:57:55.801073 master-0 kubenswrapper[6932]: I0319 11:57:55.801005 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" event={"ID":"f5c84321-1399-48d6-916f-38011af8fd94","Type":"ContainerStarted","Data":"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50"} Mar 19 11:57:55.806230 master-0 kubenswrapper[6932]: I0319 11:57:55.806115 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" event={"ID":"92e401a4-ed2f-46f7-924b-329d7b313e6a","Type":"ContainerStarted","Data":"18f4fa41b32bbdc0315d2c159f68c3407a8234dacc09fe18dea04525d0e88d8c"} Mar 19 11:57:55.808662 master-0 kubenswrapper[6932]: I0319 11:57:55.808604 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" 
event={"ID":"b5c7eb66-e23e-40df-883c-fed012c07f26","Type":"ContainerStarted","Data":"eb85bccaf67497361ad0248062217ad1eca679f3251007a3d7c7123017cdafb6"} Mar 19 11:57:55.808662 master-0 kubenswrapper[6932]: I0319 11:57:55.808657 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" event={"ID":"b5c7eb66-e23e-40df-883c-fed012c07f26","Type":"ContainerStarted","Data":"01bd4a2802323b3faf679fc3ea0fe20efacc45eab046badf1be6c2b07116febc"} Mar 19 11:57:55.820508 master-0 kubenswrapper[6932]: I0319 11:57:55.820444 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" event={"ID":"034cad93-a500-4c58-8d97-fa49866a0d5e","Type":"ContainerStarted","Data":"b358fce6bb46e5b5037cb28d6e8fc423fe1541e849c427617b2d5f7f7a209743"} Mar 19 11:57:55.820948 master-0 kubenswrapper[6932]: I0319 11:57:55.820901 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" podStartSLOduration=5.820887577 podStartE2EDuration="5.820887577s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:55.818293564 +0000 UTC m=+300.177353796" watchObservedRunningTime="2026-03-19 11:57:55.820887577 +0000 UTC m=+300.179947799" Mar 19 11:57:55.822201 master-0 kubenswrapper[6932]: I0319 11:57:55.822167 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" event={"ID":"bb22a965-9b36-40cd-993d-747a3978be8e","Type":"ContainerStarted","Data":"604f16a2ad7d04e1bbd75b7eca1988232760bcd65e1311be08e2c7a3cbb4c10a"} Mar 19 11:57:55.825297 master-0 kubenswrapper[6932]: I0319 11:57:55.825258 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" event={"ID":"ac09dba7-398c-4b0a-a415-edb73cb4cf30","Type":"ContainerStarted","Data":"69df5a6d39135c48aafc52fb3d7d5fe52b6231ca86cf35979ade644f1cb616f0"} Mar 19 11:57:55.825297 master-0 kubenswrapper[6932]: I0319 11:57:55.825287 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" event={"ID":"ac09dba7-398c-4b0a-a415-edb73cb4cf30","Type":"ContainerStarted","Data":"94301b494e7fa86c5ac2e6fa986da464195a196e9774c438e5a44b6eb0b525ae"} Mar 19 11:57:55.826991 master-0 kubenswrapper[6932]: I0319 11:57:55.826962 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" event={"ID":"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab","Type":"ContainerStarted","Data":"400c51deb702b6c3bda07b7a788cdd8190f2461aa81e2371266bf9b53e43d20a"} Mar 19 11:57:55.826991 master-0 kubenswrapper[6932]: I0319 11:57:55.826987 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" event={"ID":"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab","Type":"ContainerStarted","Data":"8070d874c8e6aab4717f63db58f142956c8f18d4e16e21f12ce84898692af2f8"} Mar 19 11:57:55.939971 master-0 kubenswrapper[6932]: I0319 11:57:55.939906 6932 scope.go:117] "RemoveContainer" containerID="7dae6204524503aef6defd496cb7b6d0917403d46739b0545f2e50058742fb7c" Mar 19 11:57:55.964403 master-0 kubenswrapper[6932]: I0319 11:57:55.964327 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" podStartSLOduration=5.964308019 podStartE2EDuration="5.964308019s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
11:57:55.918990719 +0000 UTC m=+300.278050951" watchObservedRunningTime="2026-03-19 11:57:55.964308019 +0000 UTC m=+300.323368241" Mar 19 11:57:56.834875 master-0 kubenswrapper[6932]: I0319 11:57:56.834551 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" event={"ID":"b5c7eb66-e23e-40df-883c-fed012c07f26","Type":"ContainerStarted","Data":"c9ec0376c5c9420870ca7d9507750b596b96d81b8bcac85fe27f22a268a5eae7"} Mar 19 11:57:56.839494 master-0 kubenswrapper[6932]: I0319 11:57:56.839334 6932 generic.go:334] "Generic (PLEG): container finished" podID="cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103" containerID="1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1" exitCode=0 Mar 19 11:57:56.841878 master-0 kubenswrapper[6932]: I0319 11:57:56.841839 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2fqh" event={"ID":"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103","Type":"ContainerDied","Data":"1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1"} Mar 19 11:57:56.841958 master-0 kubenswrapper[6932]: I0319 11:57:56.841896 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:56.853285 master-0 kubenswrapper[6932]: I0319 11:57:56.853150 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:57:57.313938 master-0 kubenswrapper[6932]: I0319 11:57:57.308357 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 11:57:57.313938 master-0 kubenswrapper[6932]: I0319 11:57:57.309135 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podUID="76f8b2b8-4315-431b-a2b9-deab1bfc7884" 
containerName="installer" containerID="cri-o://182613f47c988603fa253da695bf14c0d843684239f6f10a5d0b78872f67dc68" gracePeriod=30 Mar 19 11:57:58.522680 master-0 kubenswrapper[6932]: I0319 11:57:58.522617 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:57:58.522680 master-0 kubenswrapper[6932]: I0319 11:57:58.522667 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:57:58.568639 master-0 kubenswrapper[6932]: I0319 11:57:58.568557 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:57:58.631059 master-0 kubenswrapper[6932]: I0319 11:57:58.631014 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:57:59.036076 master-0 kubenswrapper[6932]: E0319 11:57:59.036004 6932 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-conmon-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 11:57:59.227226 master-0 kubenswrapper[6932]: I0319 11:57:59.226738 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mgzld"] Mar 19 11:57:59.228888 master-0 kubenswrapper[6932]: I0319 11:57:59.228865 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.232394 master-0 kubenswrapper[6932]: I0319 11:57:59.232196 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-dnwcp" Mar 19 11:57:59.237264 master-0 kubenswrapper[6932]: I0319 11:57:59.234609 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 11:57:59.420336 master-0 kubenswrapper[6932]: I0319 11:57:59.420197 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/24f71770-714e-4111-9188-ad8663c6baa7-rootfs\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.420336 master-0 kubenswrapper[6932]: I0319 11:57:59.420293 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.420336 master-0 kubenswrapper[6932]: I0319 11:57:59.420312 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.420832 master-0 kubenswrapper[6932]: I0319 11:57:59.420348 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m287x\" (UniqueName: \"kubernetes.io/projected/24f71770-714e-4111-9188-ad8663c6baa7-kube-api-access-m287x\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.522131 master-0 kubenswrapper[6932]: I0319 11:57:59.522090 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.522435 master-0 kubenswrapper[6932]: I0319 11:57:59.522417 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.522669 master-0 kubenswrapper[6932]: I0319 11:57:59.522647 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m287x\" (UniqueName: \"kubernetes.io/projected/24f71770-714e-4111-9188-ad8663c6baa7-kube-api-access-m287x\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.522842 master-0 kubenswrapper[6932]: I0319 11:57:59.522822 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/24f71770-714e-4111-9188-ad8663c6baa7-rootfs\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.523513 
master-0 kubenswrapper[6932]: I0319 11:57:59.522941 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/24f71770-714e-4111-9188-ad8663c6baa7-rootfs\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.523617 master-0 kubenswrapper[6932]: I0319 11:57:59.523265 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.539391 master-0 kubenswrapper[6932]: I0319 11:57:59.539315 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.540288 master-0 kubenswrapper[6932]: I0319 11:57:59.540258 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m287x\" (UniqueName: \"kubernetes.io/projected/24f71770-714e-4111-9188-ad8663c6baa7-kube-api-access-m287x\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:57:59.551751 master-0 kubenswrapper[6932]: I0319 11:57:59.551674 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:58:01.475217 master-0 kubenswrapper[6932]: I0319 11:58:01.475032 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t"] Mar 19 11:58:02.147756 master-0 kubenswrapper[6932]: I0319 11:58:02.143667 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 11:58:02.147756 master-0 kubenswrapper[6932]: I0319 11:58:02.145206 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.163321 master-0 kubenswrapper[6932]: I0319 11:58:02.163248 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 11:58:02.263232 master-0 kubenswrapper[6932]: I0319 11:58:02.263174 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.263502 master-0 kubenswrapper[6932]: I0319 11:58:02.263278 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e4c691-68ed-49f6-a8f6-c87579b65f07-kube-api-access\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.263502 master-0 kubenswrapper[6932]: I0319 11:58:02.263310 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-var-lock\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.364571 master-0 kubenswrapper[6932]: I0319 11:58:02.364452 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.364891 master-0 kubenswrapper[6932]: I0319 11:58:02.364618 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.364891 master-0 kubenswrapper[6932]: I0319 11:58:02.364708 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e4c691-68ed-49f6-a8f6-c87579b65f07-kube-api-access\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.364891 master-0 kubenswrapper[6932]: I0319 11:58:02.364783 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-var-lock\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.365036 master-0 kubenswrapper[6932]: I0319 11:58:02.364976 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-var-lock\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.393793 master-0 kubenswrapper[6932]: I0319 11:58:02.393715 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e4c691-68ed-49f6-a8f6-c87579b65f07-kube-api-access\") pod \"installer-2-master-0\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:02.487194 master-0 kubenswrapper[6932]: I0319 11:58:02.487110 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:04.895855 master-0 kubenswrapper[6932]: I0319 11:58:04.895800 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_76f8b2b8-4315-431b-a2b9-deab1bfc7884/installer/0.log" Mar 19 11:58:04.895855 master-0 kubenswrapper[6932]: I0319 11:58:04.895860 6932 generic.go:334] "Generic (PLEG): container finished" podID="76f8b2b8-4315-431b-a2b9-deab1bfc7884" containerID="182613f47c988603fa253da695bf14c0d843684239f6f10a5d0b78872f67dc68" exitCode=1 Mar 19 11:58:04.896778 master-0 kubenswrapper[6932]: I0319 11:58:04.895896 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"76f8b2b8-4315-431b-a2b9-deab1bfc7884","Type":"ContainerDied","Data":"182613f47c988603fa253da695bf14c0d843684239f6f10a5d0b78872f67dc68"} Mar 19 11:58:08.564808 master-0 kubenswrapper[6932]: I0319 11:58:08.562274 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:58:09.137248 master-0 kubenswrapper[6932]: I0319 11:58:09.137195 6932 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_76f8b2b8-4315-431b-a2b9-deab1bfc7884/installer/0.log" Mar 19 11:58:09.137491 master-0 kubenswrapper[6932]: I0319 11:58:09.137282 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:58:09.148309 master-0 kubenswrapper[6932]: E0319 11:58:09.148255 6932 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-conmon-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 11:58:09.281618 master-0 kubenswrapper[6932]: I0319 11:58:09.281512 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kubelet-dir\") pod \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " Mar 19 11:58:09.281618 master-0 kubenswrapper[6932]: I0319 11:58:09.281625 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-var-lock\") pod \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " Mar 19 11:58:09.282185 master-0 kubenswrapper[6932]: I0319 11:58:09.281636 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"76f8b2b8-4315-431b-a2b9-deab1bfc7884" (UID: "76f8b2b8-4315-431b-a2b9-deab1bfc7884"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:58:09.282185 master-0 kubenswrapper[6932]: I0319 11:58:09.281676 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kube-api-access\") pod \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\" (UID: \"76f8b2b8-4315-431b-a2b9-deab1bfc7884\") " Mar 19 11:58:09.282185 master-0 kubenswrapper[6932]: I0319 11:58:09.281706 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-var-lock" (OuterVolumeSpecName: "var-lock") pod "76f8b2b8-4315-431b-a2b9-deab1bfc7884" (UID: "76f8b2b8-4315-431b-a2b9-deab1bfc7884"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:58:09.282360 master-0 kubenswrapper[6932]: I0319 11:58:09.282329 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:09.282360 master-0 kubenswrapper[6932]: I0319 11:58:09.282356 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76f8b2b8-4315-431b-a2b9-deab1bfc7884-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:09.284510 master-0 kubenswrapper[6932]: I0319 11:58:09.284474 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "76f8b2b8-4315-431b-a2b9-deab1bfc7884" (UID: "76f8b2b8-4315-431b-a2b9-deab1bfc7884"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:58:09.384380 master-0 kubenswrapper[6932]: I0319 11:58:09.384210 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76f8b2b8-4315-431b-a2b9-deab1bfc7884-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:09.933712 master-0 kubenswrapper[6932]: I0319 11:58:09.933656 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_76f8b2b8-4315-431b-a2b9-deab1bfc7884/installer/0.log" Mar 19 11:58:09.935357 master-0 kubenswrapper[6932]: I0319 11:58:09.933737 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"76f8b2b8-4315-431b-a2b9-deab1bfc7884","Type":"ContainerDied","Data":"95429c69af4c1bb1223ac6239bd6b9c564ae0325597ea2510ccb844f342bde54"} Mar 19 11:58:09.935357 master-0 kubenswrapper[6932]: I0319 11:58:09.933784 6932 scope.go:117] "RemoveContainer" containerID="182613f47c988603fa253da695bf14c0d843684239f6f10a5d0b78872f67dc68" Mar 19 11:58:09.935357 master-0 kubenswrapper[6932]: I0319 11:58:09.933811 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 11:58:10.685752 master-0 kubenswrapper[6932]: I0319 11:58:10.683113 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 11:58:11.066964 master-0 kubenswrapper[6932]: I0319 11:58:11.066834 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 11:58:11.878235 master-0 kubenswrapper[6932]: I0319 11:58:11.878170 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76f8b2b8-4315-431b-a2b9-deab1bfc7884" path="/var/lib/kubelet/pods/76f8b2b8-4315-431b-a2b9-deab1bfc7884/volumes" Mar 19 11:58:13.550675 master-0 kubenswrapper[6932]: I0319 11:58:13.550347 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 11:58:13.997073 master-0 kubenswrapper[6932]: I0319 11:58:13.993465 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" event={"ID":"034cad93-a500-4c58-8d97-fa49866a0d5e","Type":"ContainerStarted","Data":"819d3c5a7ac9411860ae19e82081d6731757e22d45f920c88cd6385079ad4506"} Mar 19 11:58:14.003285 master-0 kubenswrapper[6932]: I0319 11:58:14.002846 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" podUID="f5c84321-1399-48d6-916f-38011af8fd94" containerName="kube-rbac-proxy" containerID="cri-o://3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50" gracePeriod=30 Mar 19 11:58:14.003285 master-0 kubenswrapper[6932]: I0319 11:58:14.002970 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" 
event={"ID":"f5c84321-1399-48d6-916f-38011af8fd94","Type":"ContainerStarted","Data":"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c"} Mar 19 11:58:14.003285 master-0 kubenswrapper[6932]: I0319 11:58:14.003014 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" podUID="f5c84321-1399-48d6-916f-38011af8fd94" containerName="machine-approver-controller" containerID="cri-o://53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c" gracePeriod=30 Mar 19 11:58:14.027177 master-0 kubenswrapper[6932]: I0319 11:58:14.026173 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" event={"ID":"ac09dba7-398c-4b0a-a415-edb73cb4cf30","Type":"ContainerStarted","Data":"adb042fefde119c6586ea62230e4fd60ae5798a4428c1ca389426a4a8064a96f"} Mar 19 11:58:14.031192 master-0 kubenswrapper[6932]: I0319 11:58:14.029622 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" podStartSLOduration=6.064508743 podStartE2EDuration="24.029594173s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:55.349294446 +0000 UTC m=+299.708354668" lastFinishedPulling="2026-03-19 11:58:13.314379876 +0000 UTC m=+317.673440098" observedRunningTime="2026-03-19 11:58:14.026416605 +0000 UTC m=+318.385476837" watchObservedRunningTime="2026-03-19 11:58:14.029594173 +0000 UTC m=+318.388654395" Mar 19 11:58:14.031192 master-0 kubenswrapper[6932]: I0319 11:58:14.030886 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"02e4c691-68ed-49f6-a8f6-c87579b65f07","Type":"ContainerStarted","Data":"823cd400f46cb5e679e7cac9ded1fe167be8f0e08c4ab31db4dd5e9aa924bcd5"} Mar 19 11:58:14.063486 master-0 kubenswrapper[6932]: I0319 11:58:14.062413 6932 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" event={"ID":"52bdf7cc-f07d-487e-937c-6567f194947e","Type":"ContainerStarted","Data":"6d3653ccd7b7a331a5505780608fc7be11da3a194e4ef0d40b8621a68ae0e383"} Mar 19 11:58:14.071369 master-0 kubenswrapper[6932]: I0319 11:58:14.071273 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" podStartSLOduration=6.56149162 podStartE2EDuration="24.071240523s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:55.592687466 +0000 UTC m=+299.951747688" lastFinishedPulling="2026-03-19 11:58:13.102436369 +0000 UTC m=+317.461496591" observedRunningTime="2026-03-19 11:58:14.06366765 +0000 UTC m=+318.422727872" watchObservedRunningTime="2026-03-19 11:58:14.071240523 +0000 UTC m=+318.430300765" Mar 19 11:58:14.073116 master-0 kubenswrapper[6932]: I0319 11:58:14.071990 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2fqh" event={"ID":"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103","Type":"ContainerStarted","Data":"7d14696827ce29791692c6d66a1bdd4a26e679b78f8bfa5cabeba4c04ca10841"} Mar 19 11:58:14.093186 master-0 kubenswrapper[6932]: I0319 11:58:14.093094 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" event={"ID":"92e401a4-ed2f-46f7-924b-329d7b313e6a","Type":"ContainerStarted","Data":"184142423d13ad0dada3f9fb80ba56b19112320340bc71f1f71f1258d5f3fe78"} Mar 19 11:58:14.093186 master-0 kubenswrapper[6932]: I0319 11:58:14.093163 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" event={"ID":"92e401a4-ed2f-46f7-924b-329d7b313e6a","Type":"ContainerStarted","Data":"46876a7e063d974c121cff378937380f72a9002e08dc430717d4d702ce311e44"} Mar 19 
11:58:14.095181 master-0 kubenswrapper[6932]: I0319 11:58:14.095110 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" podStartSLOduration=14.73437737 podStartE2EDuration="24.095093742s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:54.863621263 +0000 UTC m=+299.222681485" lastFinishedPulling="2026-03-19 11:58:04.224337635 +0000 UTC m=+308.583397857" observedRunningTime="2026-03-19 11:58:14.092334785 +0000 UTC m=+318.451395027" watchObservedRunningTime="2026-03-19 11:58:14.095093742 +0000 UTC m=+318.454153964" Mar 19 11:58:14.118323 master-0 kubenswrapper[6932]: I0319 11:58:14.113078 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgzld" event={"ID":"24f71770-714e-4111-9188-ad8663c6baa7","Type":"ContainerStarted","Data":"ed8670596dc2cf1c143e88b33c5db3a45b6bfb4638c8b31a2dcca3f77e463698"} Mar 19 11:58:14.118323 master-0 kubenswrapper[6932]: I0319 11:58:14.113138 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgzld" event={"ID":"24f71770-714e-4111-9188-ad8663c6baa7","Type":"ContainerStarted","Data":"38cb26629a14fdae9d7f35eac30d1706193c11f4823405b1ab890376e3178bdd"} Mar 19 11:58:14.148783 master-0 kubenswrapper[6932]: I0319 11:58:14.145742 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" event={"ID":"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab","Type":"ContainerStarted","Data":"b944a4edc225ffafd52e1cf5334f5ea1a260002ecbba73ca64aac1df4ae6f81e"} Mar 19 11:58:14.148783 master-0 kubenswrapper[6932]: I0319 11:58:14.148486 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" 
event={"ID":"bb22a965-9b36-40cd-993d-747a3978be8e","Type":"ContainerStarted","Data":"e8b4004249dd6c1857124824da21ab9935cdb27e1cbfddfbf815045f406df122"} Mar 19 11:58:14.183743 master-0 kubenswrapper[6932]: I0319 11:58:14.180564 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" podStartSLOduration=6.141379598 podStartE2EDuration="24.180549517s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:55.062819529 +0000 UTC m=+299.421879751" lastFinishedPulling="2026-03-19 11:58:13.101989408 +0000 UTC m=+317.461049670" observedRunningTime="2026-03-19 11:58:14.133331401 +0000 UTC m=+318.492391623" watchObservedRunningTime="2026-03-19 11:58:14.180549517 +0000 UTC m=+318.539609739" Mar 19 11:58:14.183743 master-0 kubenswrapper[6932]: I0319 11:58:14.182412 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w2fqh" podStartSLOduration=3.642679107 podStartE2EDuration="36.182402983s" podCreationTimestamp="2026-03-19 11:57:38 +0000 UTC" firstStartedPulling="2026-03-19 11:57:40.562342364 +0000 UTC m=+284.921402586" lastFinishedPulling="2026-03-19 11:58:13.10206624 +0000 UTC m=+317.461126462" observedRunningTime="2026-03-19 11:58:14.180464575 +0000 UTC m=+318.539524797" watchObservedRunningTime="2026-03-19 11:58:14.182402983 +0000 UTC m=+318.541463205" Mar 19 11:58:14.212088 master-0 kubenswrapper[6932]: I0319 11:58:14.211167 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-6cb57bb5db-9fbqc_f5c84321-1399-48d6-916f-38011af8fd94/machine-approver-controller/0.log" Mar 19 11:58:14.217105 master-0 kubenswrapper[6932]: I0319 11:58:14.217071 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:58:14.221664 master-0 kubenswrapper[6932]: I0319 11:58:14.221589 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mgzld" podStartSLOduration=15.221566894 podStartE2EDuration="15.221566894s" podCreationTimestamp="2026-03-19 11:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:14.212282498 +0000 UTC m=+318.571342730" watchObservedRunningTime="2026-03-19 11:58:14.221566894 +0000 UTC m=+318.580627116" Mar 19 11:58:14.254520 master-0 kubenswrapper[6932]: I0319 11:58:14.253388 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" podStartSLOduration=6.434757723 podStartE2EDuration="24.253368245s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:55.598391935 +0000 UTC m=+299.957452157" lastFinishedPulling="2026-03-19 11:58:13.417002457 +0000 UTC m=+317.776062679" observedRunningTime="2026-03-19 11:58:14.248943078 +0000 UTC m=+318.608003300" watchObservedRunningTime="2026-03-19 11:58:14.253368245 +0000 UTC m=+318.612428467" Mar 19 11:58:14.317230 master-0 kubenswrapper[6932]: I0319 11:58:14.310332 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" podStartSLOduration=10.063824822 podStartE2EDuration="24.310307798s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:54.849602922 +0000 UTC m=+299.208663144" lastFinishedPulling="2026-03-19 11:58:09.096085898 +0000 UTC m=+313.455146120" observedRunningTime="2026-03-19 11:58:14.305006129 +0000 UTC m=+318.664066351" watchObservedRunningTime="2026-03-19 
11:58:14.310307798 +0000 UTC m=+318.669368020" Mar 19 11:58:14.400251 master-0 kubenswrapper[6932]: I0319 11:58:14.400178 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b4mx\" (UniqueName: \"kubernetes.io/projected/f5c84321-1399-48d6-916f-38011af8fd94-kube-api-access-2b4mx\") pod \"f5c84321-1399-48d6-916f-38011af8fd94\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " Mar 19 11:58:14.400510 master-0 kubenswrapper[6932]: I0319 11:58:14.400324 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f5c84321-1399-48d6-916f-38011af8fd94-machine-approver-tls\") pod \"f5c84321-1399-48d6-916f-38011af8fd94\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " Mar 19 11:58:14.400510 master-0 kubenswrapper[6932]: I0319 11:58:14.400390 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-auth-proxy-config\") pod \"f5c84321-1399-48d6-916f-38011af8fd94\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " Mar 19 11:58:14.401076 master-0 kubenswrapper[6932]: I0319 11:58:14.401019 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-config\") pod \"f5c84321-1399-48d6-916f-38011af8fd94\" (UID: \"f5c84321-1399-48d6-916f-38011af8fd94\") " Mar 19 11:58:14.401250 master-0 kubenswrapper[6932]: I0319 11:58:14.401164 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "f5c84321-1399-48d6-916f-38011af8fd94" (UID: "f5c84321-1399-48d6-916f-38011af8fd94"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:58:14.401623 master-0 kubenswrapper[6932]: I0319 11:58:14.401594 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-config" (OuterVolumeSpecName: "config") pod "f5c84321-1399-48d6-916f-38011af8fd94" (UID: "f5c84321-1399-48d6-916f-38011af8fd94"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:58:14.423743 master-0 kubenswrapper[6932]: I0319 11:58:14.421310 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c84321-1399-48d6-916f-38011af8fd94-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "f5c84321-1399-48d6-916f-38011af8fd94" (UID: "f5c84321-1399-48d6-916f-38011af8fd94"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:58:14.435083 master-0 kubenswrapper[6932]: I0319 11:58:14.435011 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c84321-1399-48d6-916f-38011af8fd94-kube-api-access-2b4mx" (OuterVolumeSpecName: "kube-api-access-2b4mx") pod "f5c84321-1399-48d6-916f-38011af8fd94" (UID: "f5c84321-1399-48d6-916f-38011af8fd94"). InnerVolumeSpecName "kube-api-access-2b4mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:58:14.503751 master-0 kubenswrapper[6932]: I0319 11:58:14.503006 6932 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f5c84321-1399-48d6-916f-38011af8fd94-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:14.503751 master-0 kubenswrapper[6932]: I0319 11:58:14.503068 6932 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:14.503751 master-0 kubenswrapper[6932]: I0319 11:58:14.503084 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c84321-1399-48d6-916f-38011af8fd94-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:14.503751 master-0 kubenswrapper[6932]: I0319 11:58:14.503098 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b4mx\" (UniqueName: \"kubernetes.io/projected/f5c84321-1399-48d6-916f-38011af8fd94-kube-api-access-2b4mx\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:15.157476 master-0 kubenswrapper[6932]: I0319 11:58:15.157401 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" event={"ID":"bb22a965-9b36-40cd-993d-747a3978be8e","Type":"ContainerStarted","Data":"93c6bf16ea257f169760e54138a0274df0d338ad8d904eba474636285ea1b177"} Mar 19 11:58:15.159576 master-0 kubenswrapper[6932]: I0319 11:58:15.159525 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-6cb57bb5db-9fbqc_f5c84321-1399-48d6-916f-38011af8fd94/machine-approver-controller/0.log" Mar 19 11:58:15.160138 master-0 kubenswrapper[6932]: I0319 11:58:15.160097 6932 generic.go:334] "Generic (PLEG): container finished" 
podID="f5c84321-1399-48d6-916f-38011af8fd94" containerID="53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c" exitCode=2 Mar 19 11:58:15.160138 master-0 kubenswrapper[6932]: I0319 11:58:15.160134 6932 generic.go:334] "Generic (PLEG): container finished" podID="f5c84321-1399-48d6-916f-38011af8fd94" containerID="3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50" exitCode=0 Mar 19 11:58:15.160248 master-0 kubenswrapper[6932]: I0319 11:58:15.160166 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" event={"ID":"f5c84321-1399-48d6-916f-38011af8fd94","Type":"ContainerDied","Data":"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c"} Mar 19 11:58:15.160248 master-0 kubenswrapper[6932]: I0319 11:58:15.160219 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" Mar 19 11:58:15.160327 master-0 kubenswrapper[6932]: I0319 11:58:15.160255 6932 scope.go:117] "RemoveContainer" containerID="53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c" Mar 19 11:58:15.160582 master-0 kubenswrapper[6932]: I0319 11:58:15.160239 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" event={"ID":"f5c84321-1399-48d6-916f-38011af8fd94","Type":"ContainerDied","Data":"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50"} Mar 19 11:58:15.160582 master-0 kubenswrapper[6932]: I0319 11:58:15.160576 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc" event={"ID":"f5c84321-1399-48d6-916f-38011af8fd94","Type":"ContainerDied","Data":"41fd122de6629955b52f32ce925718d564553bd270f16d2cdb7c67de534171c6"} Mar 19 11:58:15.166932 master-0 kubenswrapper[6932]: I0319 11:58:15.164087 6932 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" event={"ID":"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c","Type":"ContainerStarted","Data":"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c"} Mar 19 11:58:15.166932 master-0 kubenswrapper[6932]: I0319 11:58:15.164121 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" event={"ID":"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c","Type":"ContainerStarted","Data":"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358"} Mar 19 11:58:15.166932 master-0 kubenswrapper[6932]: I0319 11:58:15.164134 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" event={"ID":"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c","Type":"ContainerStarted","Data":"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076"} Mar 19 11:58:15.166932 master-0 kubenswrapper[6932]: I0319 11:58:15.164263 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="cluster-cloud-controller-manager" containerID="cri-o://0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076" gracePeriod=30 Mar 19 11:58:15.166932 master-0 kubenswrapper[6932]: I0319 11:58:15.164466 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="kube-rbac-proxy" containerID="cri-o://cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c" gracePeriod=30 Mar 19 11:58:15.166932 master-0 
kubenswrapper[6932]: I0319 11:58:15.164516 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="config-sync-controllers" containerID="cri-o://f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358" gracePeriod=30 Mar 19 11:58:15.182008 master-0 kubenswrapper[6932]: I0319 11:58:15.181956 6932 scope.go:117] "RemoveContainer" containerID="3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50" Mar 19 11:58:15.183959 master-0 kubenswrapper[6932]: I0319 11:58:15.182810 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" podStartSLOduration=7.5217273559999995 podStartE2EDuration="25.182772113s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:55.582273073 +0000 UTC m=+299.941333295" lastFinishedPulling="2026-03-19 11:58:13.24331783 +0000 UTC m=+317.602378052" observedRunningTime="2026-03-19 11:58:15.182352683 +0000 UTC m=+319.541412905" watchObservedRunningTime="2026-03-19 11:58:15.182772113 +0000 UTC m=+319.541832335" Mar 19 11:58:15.187857 master-0 kubenswrapper[6932]: I0319 11:58:15.187497 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" event={"ID":"75aedbcd-f6ed-43a1-941b-4b04887ffe8e","Type":"ContainerStarted","Data":"3810b698a7664ecffbd70d9f8ffd6900645de0e326d8f4fd4544c2849ae84c43"} Mar 19 11:58:15.211826 master-0 kubenswrapper[6932]: I0319 11:58:15.211743 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"02e4c691-68ed-49f6-a8f6-c87579b65f07","Type":"ContainerStarted","Data":"6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3"} Mar 19 11:58:15.218622 master-0 
kubenswrapper[6932]: I0319 11:58:15.217992 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgzld" event={"ID":"24f71770-714e-4111-9188-ad8663c6baa7","Type":"ContainerStarted","Data":"2e3c982087e82b9d23ad6ad4fd35e925eb9a2a833ffd5dd37457fcdc3dc142e2"} Mar 19 11:58:15.248258 master-0 kubenswrapper[6932]: I0319 11:58:15.248182 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc"] Mar 19 11:58:15.260755 master-0 kubenswrapper[6932]: I0319 11:58:15.259920 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-9fbqc"] Mar 19 11:58:15.260755 master-0 kubenswrapper[6932]: I0319 11:58:15.259949 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" podStartSLOduration=6.475563103 podStartE2EDuration="25.259918156s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:54.53077899 +0000 UTC m=+298.889839212" lastFinishedPulling="2026-03-19 11:58:13.315134043 +0000 UTC m=+317.674194265" observedRunningTime="2026-03-19 11:58:15.248312625 +0000 UTC m=+319.607372867" watchObservedRunningTime="2026-03-19 11:58:15.259918156 +0000 UTC m=+319.618978378" Mar 19 11:58:15.286753 master-0 kubenswrapper[6932]: I0319 11:58:15.281793 6932 scope.go:117] "RemoveContainer" containerID="53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c" Mar 19 11:58:15.312108 master-0 kubenswrapper[6932]: E0319 11:58:15.311030 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c\": container with ID starting with 53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c not found: 
ID does not exist" containerID="53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c" Mar 19 11:58:15.312108 master-0 kubenswrapper[6932]: I0319 11:58:15.311183 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c"} err="failed to get container status \"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c\": rpc error: code = NotFound desc = could not find container \"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c\": container with ID starting with 53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c not found: ID does not exist" Mar 19 11:58:15.312108 master-0 kubenswrapper[6932]: I0319 11:58:15.311227 6932 scope.go:117] "RemoveContainer" containerID="3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50" Mar 19 11:58:15.312108 master-0 kubenswrapper[6932]: E0319 11:58:15.311834 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50\": container with ID starting with 3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50 not found: ID does not exist" containerID="3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50" Mar 19 11:58:15.312108 master-0 kubenswrapper[6932]: I0319 11:58:15.311855 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50"} err="failed to get container status \"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50\": rpc error: code = NotFound desc = could not find container \"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50\": container with ID starting with 3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50 not found: ID does not exist" Mar 19 11:58:15.312108 
master-0 kubenswrapper[6932]: I0319 11:58:15.311873 6932 scope.go:117] "RemoveContainer" containerID="53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c" Mar 19 11:58:15.312542 master-0 kubenswrapper[6932]: I0319 11:58:15.312314 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c"} err="failed to get container status \"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c\": rpc error: code = NotFound desc = could not find container \"53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c\": container with ID starting with 53ff24b876416fb487e033a7a89a5cd62b0f28db333c74d897379e227e12e25c not found: ID does not exist" Mar 19 11:58:15.312542 master-0 kubenswrapper[6932]: I0319 11:58:15.312375 6932 scope.go:117] "RemoveContainer" containerID="3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50" Mar 19 11:58:15.315937 master-0 kubenswrapper[6932]: I0319 11:58:15.315891 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50"} err="failed to get container status \"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50\": rpc error: code = NotFound desc = could not find container \"3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50\": container with ID starting with 3308325303b85ce942fe4ae8099f8a790debc1f12fb249d9487ac008a6d02e50 not found: ID does not exist" Mar 19 11:58:15.322129 master-0 kubenswrapper[6932]: I0319 11:58:15.321254 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5"] Mar 19 11:58:15.322129 master-0 kubenswrapper[6932]: E0319 11:58:15.321819 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f8b2b8-4315-431b-a2b9-deab1bfc7884" containerName="installer" Mar 19 
11:58:15.322129 master-0 kubenswrapper[6932]: I0319 11:58:15.321839 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f8b2b8-4315-431b-a2b9-deab1bfc7884" containerName="installer" Mar 19 11:58:15.322129 master-0 kubenswrapper[6932]: E0319 11:58:15.321874 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c84321-1399-48d6-916f-38011af8fd94" containerName="kube-rbac-proxy" Mar 19 11:58:15.322129 master-0 kubenswrapper[6932]: I0319 11:58:15.321884 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c84321-1399-48d6-916f-38011af8fd94" containerName="kube-rbac-proxy" Mar 19 11:58:15.322129 master-0 kubenswrapper[6932]: E0319 11:58:15.321908 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c84321-1399-48d6-916f-38011af8fd94" containerName="machine-approver-controller" Mar 19 11:58:15.322129 master-0 kubenswrapper[6932]: I0319 11:58:15.321918 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c84321-1399-48d6-916f-38011af8fd94" containerName="machine-approver-controller" Mar 19 11:58:15.322427 master-0 kubenswrapper[6932]: I0319 11:58:15.322149 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="76f8b2b8-4315-431b-a2b9-deab1bfc7884" containerName="installer" Mar 19 11:58:15.322427 master-0 kubenswrapper[6932]: I0319 11:58:15.322174 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c84321-1399-48d6-916f-38011af8fd94" containerName="kube-rbac-proxy" Mar 19 11:58:15.322427 master-0 kubenswrapper[6932]: I0319 11:58:15.322194 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c84321-1399-48d6-916f-38011af8fd94" containerName="machine-approver-controller" Mar 19 11:58:15.326995 master-0 kubenswrapper[6932]: I0319 11:58:15.323508 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.330767 master-0 kubenswrapper[6932]: I0319 11:58:15.329556 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 11:58:15.330767 master-0 kubenswrapper[6932]: I0319 11:58:15.329610 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 11:58:15.330767 master-0 kubenswrapper[6932]: I0319 11:58:15.329712 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 11:58:15.330767 master-0 kubenswrapper[6932]: I0319 11:58:15.329839 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 11:58:15.330767 master-0 kubenswrapper[6932]: I0319 11:58:15.330044 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 11:58:15.330767 master-0 kubenswrapper[6932]: I0319 11:58:15.330378 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" podStartSLOduration=7.223391862 podStartE2EDuration="25.330355437s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="2026-03-19 11:57:55.219392731 +0000 UTC m=+299.578452953" lastFinishedPulling="2026-03-19 11:58:13.326356306 +0000 UTC m=+317.685416528" observedRunningTime="2026-03-19 11:58:15.271536269 +0000 UTC m=+319.630596511" watchObservedRunningTime="2026-03-19 11:58:15.330355437 +0000 UTC m=+319.689415659" Mar 19 11:58:15.332793 master-0 kubenswrapper[6932]: I0319 11:58:15.332767 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-2l456" Mar 19 
11:58:15.334256 master-0 kubenswrapper[6932]: I0319 11:58:15.334205 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=13.33419039 podStartE2EDuration="13.33419039s" podCreationTimestamp="2026-03-19 11:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:15.311775705 +0000 UTC m=+319.670835927" watchObservedRunningTime="2026-03-19 11:58:15.33419039 +0000 UTC m=+319.693250612" Mar 19 11:58:15.386918 master-0 kubenswrapper[6932]: I0319 11:58:15.386879 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:58:15.429427 master-0 kubenswrapper[6932]: I0319 11:58:15.428648 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr788\" (UniqueName: \"kubernetes.io/projected/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-kube-api-access-dr788\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.429427 master-0 kubenswrapper[6932]: I0319 11:58:15.429165 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.429427 master-0 kubenswrapper[6932]: I0319 11:58:15.429240 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.429427 master-0 kubenswrapper[6932]: I0319 11:58:15.429374 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.530714 master-0 kubenswrapper[6932]: I0319 11:58:15.530629 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-host-etc-kube\") pod \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " Mar 19 11:58:15.530714 master-0 kubenswrapper[6932]: I0319 11:58:15.530702 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-images\") pod \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " Mar 19 11:58:15.530714 master-0 kubenswrapper[6932]: I0319 11:58:15.530741 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-cloud-controller-manager-operator-tls\") pod \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " Mar 19 11:58:15.531122 master-0 kubenswrapper[6932]: I0319 11:58:15.530767 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-auth-proxy-config\") pod \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " Mar 19 11:58:15.531122 master-0 kubenswrapper[6932]: I0319 11:58:15.530805 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tb9p\" (UniqueName: \"kubernetes.io/projected/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-kube-api-access-5tb9p\") pod \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\" (UID: \"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c\") " Mar 19 11:58:15.531122 master-0 kubenswrapper[6932]: I0319 11:58:15.531032 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr788\" (UniqueName: \"kubernetes.io/projected/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-kube-api-access-dr788\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.531122 master-0 kubenswrapper[6932]: I0319 11:58:15.531097 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.531122 master-0 kubenswrapper[6932]: I0319 11:58:15.531114 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.531274 master-0 kubenswrapper[6932]: I0319 11:58:15.531132 6932 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.531912 master-0 kubenswrapper[6932]: I0319 11:58:15.531877 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.532385 master-0 kubenswrapper[6932]: I0319 11:58:15.532318 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" (UID: "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:58:15.532427 master-0 kubenswrapper[6932]: I0319 11:58:15.532415 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" (UID: "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c"). InnerVolumeSpecName "host-etc-kube". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:58:15.532700 master-0 kubenswrapper[6932]: I0319 11:58:15.532666 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-images" (OuterVolumeSpecName: "images") pod "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" (UID: "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:58:15.533291 master-0 kubenswrapper[6932]: I0319 11:58:15.533242 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.535276 master-0 kubenswrapper[6932]: I0319 11:58:15.535225 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.537139 master-0 kubenswrapper[6932]: I0319 11:58:15.537069 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-kube-api-access-5tb9p" (OuterVolumeSpecName: "kube-api-access-5tb9p") pod "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" (UID: "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c"). InnerVolumeSpecName "kube-api-access-5tb9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:58:15.541005 master-0 kubenswrapper[6932]: I0319 11:58:15.540951 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" (UID: "66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:58:15.552029 master-0 kubenswrapper[6932]: I0319 11:58:15.551986 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr788\" (UniqueName: \"kubernetes.io/projected/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-kube-api-access-dr788\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.632310 master-0 kubenswrapper[6932]: I0319 11:58:15.632242 6932 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:15.632310 master-0 kubenswrapper[6932]: I0319 11:58:15.632288 6932 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-images\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:15.632310 master-0 kubenswrapper[6932]: I0319 11:58:15.632299 6932 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:15.632310 master-0 kubenswrapper[6932]: I0319 11:58:15.632308 6932 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:15.632310 master-0 kubenswrapper[6932]: I0319 11:58:15.632319 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tb9p\" (UniqueName: \"kubernetes.io/projected/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c-kube-api-access-5tb9p\") on node \"master-0\" DevicePath \"\"" Mar 19 11:58:15.683113 master-0 kubenswrapper[6932]: I0319 11:58:15.682981 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:58:15.698629 master-0 kubenswrapper[6932]: W0319 11:58:15.698571 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda198a18a_4dd8_4c35_b15e_2ed8acfe9bbc.slice/crio-fadafce6827750abea7ff3c06bd1da6d8ccd788c149ff361b207b48ae0bcefb8 WatchSource:0}: Error finding container fadafce6827750abea7ff3c06bd1da6d8ccd788c149ff361b207b48ae0bcefb8: Status 404 returned error can't find the container with id fadafce6827750abea7ff3c06bd1da6d8ccd788c149ff361b207b48ae0bcefb8 Mar 19 11:58:15.880672 master-0 kubenswrapper[6932]: I0319 11:58:15.880621 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c84321-1399-48d6-916f-38011af8fd94" path="/var/lib/kubelet/pods/f5c84321-1399-48d6-916f-38011af8fd94/volumes" Mar 19 11:58:16.228968 master-0 kubenswrapper[6932]: I0319 11:58:16.228892 6932 generic.go:334] "Generic (PLEG): container finished" podID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerID="cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c" exitCode=0 Mar 19 11:58:16.228968 master-0 kubenswrapper[6932]: I0319 11:58:16.228940 6932 generic.go:334] "Generic (PLEG): container finished" podID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" 
containerID="f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358" exitCode=0 Mar 19 11:58:16.228968 master-0 kubenswrapper[6932]: I0319 11:58:16.228948 6932 generic.go:334] "Generic (PLEG): container finished" podID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerID="0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076" exitCode=0 Mar 19 11:58:16.229533 master-0 kubenswrapper[6932]: I0319 11:58:16.229001 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" Mar 19 11:58:16.229533 master-0 kubenswrapper[6932]: I0319 11:58:16.229001 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" event={"ID":"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c","Type":"ContainerDied","Data":"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c"} Mar 19 11:58:16.229533 master-0 kubenswrapper[6932]: I0319 11:58:16.229215 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" event={"ID":"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c","Type":"ContainerDied","Data":"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358"} Mar 19 11:58:16.229533 master-0 kubenswrapper[6932]: I0319 11:58:16.229245 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" event={"ID":"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c","Type":"ContainerDied","Data":"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076"} Mar 19 11:58:16.229533 master-0 kubenswrapper[6932]: I0319 11:58:16.229259 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t" event={"ID":"66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c","Type":"ContainerDied","Data":"87604a2e751d08b3e86775f42a82bbd7312bc40226d08e5db1020a8212eac309"} Mar 19 11:58:16.229533 master-0 kubenswrapper[6932]: I0319 11:58:16.229289 6932 scope.go:117] "RemoveContainer" containerID="cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c" Mar 19 11:58:16.235824 master-0 kubenswrapper[6932]: I0319 11:58:16.235764 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" event={"ID":"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc","Type":"ContainerStarted","Data":"5a48c0f16a923ed0b10bf8df2ccd8ed50c32745daa2a915ec8165d2602a44666"} Mar 19 11:58:16.235894 master-0 kubenswrapper[6932]: I0319 11:58:16.235832 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" event={"ID":"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc","Type":"ContainerStarted","Data":"3abc6256b750afb46322fef3797ff81ed907cc609ef4140cb0b3c6ad37c1b6c5"} Mar 19 11:58:16.235894 master-0 kubenswrapper[6932]: I0319 11:58:16.235855 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" event={"ID":"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc","Type":"ContainerStarted","Data":"fadafce6827750abea7ff3c06bd1da6d8ccd788c149ff361b207b48ae0bcefb8"} Mar 19 11:58:16.246105 master-0 kubenswrapper[6932]: I0319 11:58:16.245422 6932 scope.go:117] "RemoveContainer" containerID="f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358" Mar 19 11:58:16.253758 master-0 kubenswrapper[6932]: I0319 11:58:16.252421 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t"] Mar 19 11:58:16.258846 master-0 
kubenswrapper[6932]: I0319 11:58:16.258785 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-wr65t"] Mar 19 11:58:16.266525 master-0 kubenswrapper[6932]: I0319 11:58:16.266484 6932 scope.go:117] "RemoveContainer" containerID="0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076" Mar 19 11:58:16.281769 master-0 kubenswrapper[6932]: I0319 11:58:16.280928 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" podStartSLOduration=1.2809030670000001 podStartE2EDuration="1.280903067s" podCreationTimestamp="2026-03-19 11:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:16.279020522 +0000 UTC m=+320.638080754" watchObservedRunningTime="2026-03-19 11:58:16.280903067 +0000 UTC m=+320.639963289" Mar 19 11:58:16.286949 master-0 kubenswrapper[6932]: I0319 11:58:16.286890 6932 scope.go:117] "RemoveContainer" containerID="cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c" Mar 19 11:58:16.287531 master-0 kubenswrapper[6932]: E0319 11:58:16.287491 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c\": container with ID starting with cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c not found: ID does not exist" containerID="cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c" Mar 19 11:58:16.287592 master-0 kubenswrapper[6932]: I0319 11:58:16.287532 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c"} err="failed to get container status 
\"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c\": rpc error: code = NotFound desc = could not find container \"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c\": container with ID starting with cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c not found: ID does not exist" Mar 19 11:58:16.287592 master-0 kubenswrapper[6932]: I0319 11:58:16.287564 6932 scope.go:117] "RemoveContainer" containerID="f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358" Mar 19 11:58:16.288040 master-0 kubenswrapper[6932]: E0319 11:58:16.287990 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358\": container with ID starting with f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358 not found: ID does not exist" containerID="f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358" Mar 19 11:58:16.288040 master-0 kubenswrapper[6932]: I0319 11:58:16.288015 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358"} err="failed to get container status \"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358\": rpc error: code = NotFound desc = could not find container \"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358\": container with ID starting with f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358 not found: ID does not exist" Mar 19 11:58:16.288040 master-0 kubenswrapper[6932]: I0319 11:58:16.288030 6932 scope.go:117] "RemoveContainer" containerID="0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076" Mar 19 11:58:16.288359 master-0 kubenswrapper[6932]: E0319 11:58:16.288312 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076\": container with ID starting with 0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076 not found: ID does not exist" containerID="0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076" Mar 19 11:58:16.288359 master-0 kubenswrapper[6932]: I0319 11:58:16.288342 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076"} err="failed to get container status \"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076\": rpc error: code = NotFound desc = could not find container \"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076\": container with ID starting with 0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076 not found: ID does not exist" Mar 19 11:58:16.288359 master-0 kubenswrapper[6932]: I0319 11:58:16.288356 6932 scope.go:117] "RemoveContainer" containerID="cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c" Mar 19 11:58:16.288899 master-0 kubenswrapper[6932]: I0319 11:58:16.288593 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c"} err="failed to get container status \"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c\": rpc error: code = NotFound desc = could not find container \"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c\": container with ID starting with cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c not found: ID does not exist" Mar 19 11:58:16.288899 master-0 kubenswrapper[6932]: I0319 11:58:16.288624 6932 scope.go:117] "RemoveContainer" containerID="f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358" Mar 19 11:58:16.289150 master-0 kubenswrapper[6932]: I0319 11:58:16.289109 6932 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358"} err="failed to get container status \"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358\": rpc error: code = NotFound desc = could not find container \"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358\": container with ID starting with f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358 not found: ID does not exist" Mar 19 11:58:16.289150 master-0 kubenswrapper[6932]: I0319 11:58:16.289134 6932 scope.go:117] "RemoveContainer" containerID="0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076" Mar 19 11:58:16.293779 master-0 kubenswrapper[6932]: I0319 11:58:16.292985 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076"} err="failed to get container status \"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076\": rpc error: code = NotFound desc = could not find container \"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076\": container with ID starting with 0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076 not found: ID does not exist" Mar 19 11:58:16.293779 master-0 kubenswrapper[6932]: I0319 11:58:16.293044 6932 scope.go:117] "RemoveContainer" containerID="cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c" Mar 19 11:58:16.303240 master-0 kubenswrapper[6932]: I0319 11:58:16.303154 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c"} err="failed to get container status \"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c\": rpc error: code = NotFound desc = could not find container \"cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c\": container with ID starting with 
cbe0659a905424cc618a0ecef6205e4a5461c4a5aa8693bd54b2478559a7a18c not found: ID does not exist" Mar 19 11:58:16.303240 master-0 kubenswrapper[6932]: I0319 11:58:16.303229 6932 scope.go:117] "RemoveContainer" containerID="f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358" Mar 19 11:58:16.305850 master-0 kubenswrapper[6932]: I0319 11:58:16.305798 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358"} err="failed to get container status \"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358\": rpc error: code = NotFound desc = could not find container \"f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358\": container with ID starting with f3da7511f1427cbfb861a8a8ebd128e7a1eb37f162cf4bec0dc915a49830a358 not found: ID does not exist" Mar 19 11:58:16.305850 master-0 kubenswrapper[6932]: I0319 11:58:16.305838 6932 scope.go:117] "RemoveContainer" containerID="0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076" Mar 19 11:58:16.311156 master-0 kubenswrapper[6932]: I0319 11:58:16.311094 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076"} err="failed to get container status \"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076\": rpc error: code = NotFound desc = could not find container \"0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076\": container with ID starting with 0d890693963df7296b70f0c7e51e810ca5564ddbd56f620d36d327f99163e076 not found: ID does not exist" Mar 19 11:58:16.311363 master-0 kubenswrapper[6932]: I0319 11:58:16.311333 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86"] Mar 19 11:58:16.311657 master-0 kubenswrapper[6932]: E0319 11:58:16.311631 
6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="kube-rbac-proxy" Mar 19 11:58:16.311657 master-0 kubenswrapper[6932]: I0319 11:58:16.311653 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="kube-rbac-proxy" Mar 19 11:58:16.311739 master-0 kubenswrapper[6932]: E0319 11:58:16.311680 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="config-sync-controllers" Mar 19 11:58:16.311739 master-0 kubenswrapper[6932]: I0319 11:58:16.311691 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="config-sync-controllers" Mar 19 11:58:16.311739 master-0 kubenswrapper[6932]: E0319 11:58:16.311717 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="cluster-cloud-controller-manager" Mar 19 11:58:16.311739 master-0 kubenswrapper[6932]: I0319 11:58:16.311741 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="cluster-cloud-controller-manager" Mar 19 11:58:16.311873 master-0 kubenswrapper[6932]: I0319 11:58:16.311856 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="kube-rbac-proxy" Mar 19 11:58:16.311873 master-0 kubenswrapper[6932]: I0319 11:58:16.311868 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="config-sync-controllers" Mar 19 11:58:16.311926 master-0 kubenswrapper[6932]: I0319 11:58:16.311883 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" containerName="cluster-cloud-controller-manager" Mar 19 11:58:16.312804 master-0 kubenswrapper[6932]: I0319 11:58:16.312771 6932 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.320764 master-0 kubenswrapper[6932]: I0319 11:58:16.316796 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 19 11:58:16.320764 master-0 kubenswrapper[6932]: I0319 11:58:16.316897 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 11:58:16.320764 master-0 kubenswrapper[6932]: I0319 11:58:16.317868 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 19 11:58:16.320764 master-0 kubenswrapper[6932]: I0319 11:58:16.318015 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 11:58:16.320764 master-0 kubenswrapper[6932]: I0319 11:58:16.318142 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 11:58:16.320764 master-0 kubenswrapper[6932]: I0319 11:58:16.319057 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-kn6lc" Mar 19 11:58:16.442778 master-0 kubenswrapper[6932]: I0319 11:58:16.442707 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.442778 master-0 kubenswrapper[6932]: I0319 11:58:16.442776 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.443069 master-0 kubenswrapper[6932]: I0319 11:58:16.443014 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5jsb\" (UniqueName: \"kubernetes.io/projected/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-kube-api-access-p5jsb\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.443157 master-0 kubenswrapper[6932]: I0319 11:58:16.443118 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.443247 master-0 kubenswrapper[6932]: I0319 11:58:16.443217 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images\") pod 
\"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.545286 master-0 kubenswrapper[6932]: I0319 11:58:16.544145 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.545286 master-0 kubenswrapper[6932]: I0319 11:58:16.544219 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jsb\" (UniqueName: \"kubernetes.io/projected/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-kube-api-access-p5jsb\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.545286 master-0 kubenswrapper[6932]: I0319 11:58:16.544249 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.545286 master-0 kubenswrapper[6932]: I0319 11:58:16.544712 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.545286 master-0 kubenswrapper[6932]: I0319 11:58:16.544795 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.545286 master-0 kubenswrapper[6932]: I0319 11:58:16.544840 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.545664 master-0 kubenswrapper[6932]: I0319 11:58:16.545630 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.545869 master-0 kubenswrapper[6932]: I0319 11:58:16.545830 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.547801 master-0 kubenswrapper[6932]: I0319 11:58:16.547756 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.567755 master-0 kubenswrapper[6932]: I0319 11:58:16.567669 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jsb\" (UniqueName: \"kubernetes.io/projected/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-kube-api-access-p5jsb\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:16.645950 master-0 kubenswrapper[6932]: I0319 11:58:16.645891 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:58:17.246144 master-0 kubenswrapper[6932]: I0319 11:58:17.246088 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" event={"ID":"c8d8a09f-22d5-4f16-84d6-d5f2c504c949","Type":"ContainerStarted","Data":"6a8ee95c82e6b677420027c38c1c68131911d17ea065c53213f6254b809ed080"} Mar 19 11:58:17.246144 master-0 kubenswrapper[6932]: I0319 11:58:17.246128 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" event={"ID":"c8d8a09f-22d5-4f16-84d6-d5f2c504c949","Type":"ContainerStarted","Data":"934c2b00a5a26c98555551f393b5ebfacf94aab19ba7eb619ef808c070c8dab1"} Mar 19 11:58:17.246144 master-0 kubenswrapper[6932]: I0319 11:58:17.246139 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" event={"ID":"c8d8a09f-22d5-4f16-84d6-d5f2c504c949","Type":"ContainerStarted","Data":"36816d6fc70a9260d540c9487629bb4d582fa5330a4c11074ee3f05c1e9cbe38"} Mar 19 11:58:17.865265 master-0 kubenswrapper[6932]: I0319 11:58:17.865201 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9"] Mar 19 11:58:17.866529 master-0 kubenswrapper[6932]: I0319 11:58:17.866498 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:17.871703 master-0 kubenswrapper[6932]: I0319 11:58:17.871650 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-7qzrj" Mar 19 11:58:17.871703 master-0 kubenswrapper[6932]: I0319 11:58:17.871668 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 11:58:17.886628 master-0 kubenswrapper[6932]: I0319 11:58:17.886565 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c" path="/var/lib/kubelet/pods/66e557c9-f9bb-4eb8-bdaa-d61b62a4cf8c/volumes" Mar 19 11:58:17.889146 master-0 kubenswrapper[6932]: I0319 11:58:17.887269 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9"] Mar 19 11:58:17.973745 master-0 kubenswrapper[6932]: I0319 11:58:17.973650 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdx6s\" (UniqueName: \"kubernetes.io/projected/12809811-c9df-4e77-8c12-309831b8975d-kube-api-access-bdx6s\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:17.973976 master-0 kubenswrapper[6932]: I0319 11:58:17.973884 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:17.974085 master-0 kubenswrapper[6932]: I0319 
11:58:17.974037 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:18.077040 master-0 kubenswrapper[6932]: I0319 11:58:18.076972 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdx6s\" (UniqueName: \"kubernetes.io/projected/12809811-c9df-4e77-8c12-309831b8975d-kube-api-access-bdx6s\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:18.077383 master-0 kubenswrapper[6932]: I0319 11:58:18.077363 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:18.077481 master-0 kubenswrapper[6932]: I0319 11:58:18.077464 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:18.078544 master-0 kubenswrapper[6932]: I0319 11:58:18.078498 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:18.088082 master-0 kubenswrapper[6932]: I0319 11:58:18.087996 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:18.095613 master-0 kubenswrapper[6932]: I0319 11:58:18.095564 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdx6s\" (UniqueName: \"kubernetes.io/projected/12809811-c9df-4e77-8c12-309831b8975d-kube-api-access-bdx6s\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:18.195486 master-0 kubenswrapper[6932]: I0319 11:58:18.195432 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:58:18.258540 master-0 kubenswrapper[6932]: I0319 11:58:18.258485 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" event={"ID":"c8d8a09f-22d5-4f16-84d6-d5f2c504c949","Type":"ContainerStarted","Data":"9cf161eff839742cccd14aa8cf4bca338fbe76e5cacdbdfee7194d89b53caee1"} Mar 19 11:58:18.284114 master-0 kubenswrapper[6932]: I0319 11:58:18.284000 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" podStartSLOduration=2.283965265 podStartE2EDuration="2.283965265s" podCreationTimestamp="2026-03-19 11:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:18.281336931 +0000 UTC m=+322.640397153" watchObservedRunningTime="2026-03-19 11:58:18.283965265 +0000 UTC m=+322.643025487" Mar 19 11:58:18.639140 master-0 kubenswrapper[6932]: I0319 11:58:18.638656 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9"] Mar 19 11:58:18.643602 master-0 kubenswrapper[6932]: W0319 11:58:18.643548 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12809811_c9df_4e77_8c12_309831b8975d.slice/crio-adc1d294cb2c4faeba2726e706421944c88f312613f6f9484ed976e0c65190f9 WatchSource:0}: Error finding container adc1d294cb2c4faeba2726e706421944c88f312613f6f9484ed976e0c65190f9: Status 404 returned error can't find the container with id adc1d294cb2c4faeba2726e706421944c88f312613f6f9484ed976e0c65190f9 Mar 19 11:58:18.999802 master-0 kubenswrapper[6932]: I0319 11:58:18.999745 6932 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:58:19.000153 master-0 kubenswrapper[6932]: I0319 11:58:19.000138 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:58:19.065715 master-0 kubenswrapper[6932]: I0319 11:58:19.065655 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt"] Mar 19 11:58:19.067435 master-0 kubenswrapper[6932]: I0319 11:58:19.067412 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:58:19.068061 master-0 kubenswrapper[6932]: I0319 11:58:19.067998 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf"] Mar 19 11:58:19.069047 master-0 kubenswrapper[6932]: I0319 11:58:19.069018 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" Mar 19 11:58:19.069645 master-0 kubenswrapper[6932]: I0319 11:58:19.069615 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-sx7wj" Mar 19 11:58:19.070445 master-0 kubenswrapper[6932]: I0319 11:58:19.070386 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7dcf5569b5-kpmgt"] Mar 19 11:58:19.071239 master-0 kubenswrapper[6932]: I0319 11:58:19.071214 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.073246 master-0 kubenswrapper[6932]: I0319 11:58:19.073199 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 11:58:19.075883 master-0 kubenswrapper[6932]: I0319 11:58:19.075859 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 11:58:19.076437 master-0 kubenswrapper[6932]: I0319 11:58:19.076414 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 11:58:19.076615 master-0 kubenswrapper[6932]: I0319 11:58:19.076597 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 11:58:19.076896 master-0 kubenswrapper[6932]: I0319 11:58:19.076870 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 11:58:19.077094 master-0 kubenswrapper[6932]: I0319 11:58:19.077073 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 11:58:19.077262 master-0 kubenswrapper[6932]: I0319 11:58:19.077243 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 11:58:19.087173 master-0 kubenswrapper[6932]: I0319 11:58:19.087128 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf"] Mar 19 11:58:19.093844 master-0 kubenswrapper[6932]: I0319 11:58:19.093780 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt"] Mar 19 11:58:19.105891 master-0 kubenswrapper[6932]: I0319 11:58:19.105833 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-stats-auth\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.106125 master-0 kubenswrapper[6932]: I0319 11:58:19.105892 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-metrics-certs\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.106125 master-0 kubenswrapper[6932]: I0319 11:58:19.105954 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fvvj\" (UniqueName: \"kubernetes.io/projected/e65e2a2f-16b5-44a3-9860-741f70188ab5-kube-api-access-4fvvj\") pod \"network-check-source-b4bf74f6-llsdf\" (UID: \"e65e2a2f-16b5-44a3-9860-741f70188ab5\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" Mar 19 11:58:19.106125 master-0 kubenswrapper[6932]: I0319 11:58:19.106015 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcb7\" (UniqueName: \"kubernetes.io/projected/e2ad29ad-70ef-43c6-91f6-02f04d145673-kube-api-access-trcb7\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.106125 master-0 kubenswrapper[6932]: I0319 11:58:19.106045 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-default-certificate\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " 
pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.106125 master-0 kubenswrapper[6932]: I0319 11:58:19.106066 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2ad29ad-70ef-43c6-91f6-02f04d145673-service-ca-bundle\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.106125 master-0 kubenswrapper[6932]: I0319 11:58:19.106100 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9778f8f5-b0d1-4967-9776-9db758bba3af-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-89rdt\" (UID: \"9778f8f5-b0d1-4967-9776-9db758bba3af\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:58:19.207490 master-0 kubenswrapper[6932]: I0319 11:58:19.206964 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcb7\" (UniqueName: \"kubernetes.io/projected/e2ad29ad-70ef-43c6-91f6-02f04d145673-kube-api-access-trcb7\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.207878 master-0 kubenswrapper[6932]: I0319 11:58:19.207858 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-default-certificate\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.207985 master-0 kubenswrapper[6932]: I0319 11:58:19.207969 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2ad29ad-70ef-43c6-91f6-02f04d145673-service-ca-bundle\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.208105 master-0 kubenswrapper[6932]: I0319 11:58:19.208085 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9778f8f5-b0d1-4967-9776-9db758bba3af-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-89rdt\" (UID: \"9778f8f5-b0d1-4967-9776-9db758bba3af\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:58:19.208208 master-0 kubenswrapper[6932]: I0319 11:58:19.208196 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-stats-auth\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.208300 master-0 kubenswrapper[6932]: I0319 11:58:19.208285 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-metrics-certs\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.208404 master-0 kubenswrapper[6932]: I0319 11:58:19.208387 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fvvj\" (UniqueName: \"kubernetes.io/projected/e65e2a2f-16b5-44a3-9860-741f70188ab5-kube-api-access-4fvvj\") pod \"network-check-source-b4bf74f6-llsdf\" (UID: \"e65e2a2f-16b5-44a3-9860-741f70188ab5\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" Mar 19 
11:58:19.208987 master-0 kubenswrapper[6932]: I0319 11:58:19.208965 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2ad29ad-70ef-43c6-91f6-02f04d145673-service-ca-bundle\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.214141 master-0 kubenswrapper[6932]: I0319 11:58:19.214089 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-stats-auth\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.214141 master-0 kubenswrapper[6932]: I0319 11:58:19.214089 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-default-certificate\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.219924 master-0 kubenswrapper[6932]: I0319 11:58:19.215887 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-metrics-certs\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.219924 master-0 kubenswrapper[6932]: I0319 11:58:19.216174 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9778f8f5-b0d1-4967-9776-9db758bba3af-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-89rdt\" (UID: \"9778f8f5-b0d1-4967-9776-9db758bba3af\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:58:19.285578 master-0 kubenswrapper[6932]: I0319 11:58:19.285486 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" event={"ID":"12809811-c9df-4e77-8c12-309831b8975d","Type":"ContainerStarted","Data":"fcc618a4f9dd32b51a4220c93ac97e1a7c6ca0b446c867501c2b1f13f5181292"} Mar 19 11:58:19.285578 master-0 kubenswrapper[6932]: I0319 11:58:19.285585 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" event={"ID":"12809811-c9df-4e77-8c12-309831b8975d","Type":"ContainerStarted","Data":"b1fe1e1d136b8b5dfe456072ee9ba23578a05f4adff0afdc42e54e181c3e9663"} Mar 19 11:58:19.286279 master-0 kubenswrapper[6932]: I0319 11:58:19.285602 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" event={"ID":"12809811-c9df-4e77-8c12-309831b8975d","Type":"ContainerStarted","Data":"adc1d294cb2c4faeba2726e706421944c88f312613f6f9484ed976e0c65190f9"} Mar 19 11:58:19.301119 master-0 kubenswrapper[6932]: E0319 11:58:19.301030 6932 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-conmon-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 11:58:19.372854 master-0 kubenswrapper[6932]: I0319 11:58:19.372774 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-trcb7\" (UniqueName: \"kubernetes.io/projected/e2ad29ad-70ef-43c6-91f6-02f04d145673-kube-api-access-trcb7\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.383281 master-0 kubenswrapper[6932]: I0319 11:58:19.383229 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fvvj\" (UniqueName: \"kubernetes.io/projected/e65e2a2f-16b5-44a3-9860-741f70188ab5-kube-api-access-4fvvj\") pod \"network-check-source-b4bf74f6-llsdf\" (UID: \"e65e2a2f-16b5-44a3-9860-741f70188ab5\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" Mar 19 11:58:19.417607 master-0 kubenswrapper[6932]: I0319 11:58:19.417527 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" podStartSLOduration=2.41750696 podStartE2EDuration="2.41750696s" podCreationTimestamp="2026-03-19 11:58:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:19.415693996 +0000 UTC m=+323.774754228" watchObservedRunningTime="2026-03-19 11:58:19.41750696 +0000 UTC m=+323.776567182" Mar 19 11:58:19.432169 master-0 kubenswrapper[6932]: I0319 11:58:19.432096 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:58:19.450305 master-0 kubenswrapper[6932]: I0319 11:58:19.450218 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" Mar 19 11:58:19.468625 master-0 kubenswrapper[6932]: I0319 11:58:19.468467 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:19.491499 master-0 kubenswrapper[6932]: I0319 11:58:19.491447 6932 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:58:19.710394 master-0 kubenswrapper[6932]: I0319 11:58:19.708393 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 11:58:19.710394 master-0 kubenswrapper[6932]: I0319 11:58:19.708601 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="02e4c691-68ed-49f6-a8f6-c87579b65f07" containerName="installer" containerID="cri-o://6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3" gracePeriod=30 Mar 19 11:58:19.896321 master-0 kubenswrapper[6932]: I0319 11:58:19.896247 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt"] Mar 19 11:58:19.912873 master-0 kubenswrapper[6932]: I0319 11:58:19.912830 6932 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 11:58:20.003629 master-0 kubenswrapper[6932]: I0319 11:58:20.003534 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf"] Mar 19 11:58:20.049588 master-0 kubenswrapper[6932]: I0319 11:58:20.049528 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2fqh" podUID="cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103" containerName="registry-server" probeResult="failure" output=< Mar 19 11:58:20.049588 master-0 kubenswrapper[6932]: timeout: failed to connect service ":50051" within 1s Mar 19 11:58:20.049588 master-0 kubenswrapper[6932]: > Mar 19 11:58:20.297109 master-0 kubenswrapper[6932]: I0319 11:58:20.297038 6932 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" event={"ID":"9778f8f5-b0d1-4967-9776-9db758bba3af","Type":"ContainerStarted","Data":"5a3a840584953aa05811a73f7731c28fab3047c34d3f28cfbf2a20aad97cf6c3"} Mar 19 11:58:20.306757 master-0 kubenswrapper[6932]: I0319 11:58:20.300281 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" event={"ID":"e65e2a2f-16b5-44a3-9860-741f70188ab5","Type":"ContainerStarted","Data":"5fb0a9f9c8fd3f420f18a1ab9f8cdd9ceb13ec3805707917d2722d5b3bc13317"} Mar 19 11:58:20.306757 master-0 kubenswrapper[6932]: I0319 11:58:20.300313 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" event={"ID":"e65e2a2f-16b5-44a3-9860-741f70188ab5","Type":"ContainerStarted","Data":"a2b791c04ceadd3a171b7dda7655ef7534b61d799b6ce663909c8e48a8e61525"} Mar 19 11:58:20.306757 master-0 kubenswrapper[6932]: I0319 11:58:20.305766 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" event={"ID":"e2ad29ad-70ef-43c6-91f6-02f04d145673","Type":"ContainerStarted","Data":"e01c0d4f6330ee155cedce051137a3842f3cbc1b8b4039503e3a3e9fd950bf49"} Mar 19 11:58:20.335855 master-0 kubenswrapper[6932]: I0319 11:58:20.334867 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" podStartSLOduration=374.334848505 podStartE2EDuration="6m14.334848505s" podCreationTimestamp="2026-03-19 11:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:20.326898021 +0000 UTC m=+324.685958233" watchObservedRunningTime="2026-03-19 11:58:20.334848505 +0000 UTC m=+324.693908717" Mar 19 11:58:22.319643 master-0 kubenswrapper[6932]: I0319 11:58:22.319589 6932 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-ltk8s"] Mar 19 11:58:22.320400 master-0 kubenswrapper[6932]: I0319 11:58:22.320379 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.324499 master-0 kubenswrapper[6932]: I0319 11:58:22.324455 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 11:58:22.324620 master-0 kubenswrapper[6932]: I0319 11:58:22.324594 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 11:58:22.324670 master-0 kubenswrapper[6932]: I0319 11:58:22.324661 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-5jj8d" Mar 19 11:58:22.410099 master-0 kubenswrapper[6932]: I0319 11:58:22.409578 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.410099 master-0 kubenswrapper[6932]: I0319 11:58:22.409642 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dg9r\" (UniqueName: \"kubernetes.io/projected/6870ccc7-2094-48d8-9238-f486a4b8d5af-kube-api-access-9dg9r\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.410099 master-0 kubenswrapper[6932]: I0319 11:58:22.409984 6932 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.482340 master-0 kubenswrapper[6932]: I0319 11:58:22.482214 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-gqd94_732989c5-1b89-46f0-9917-b68613f7f005/authentication-operator/0.log" Mar 19 11:58:22.511830 master-0 kubenswrapper[6932]: I0319 11:58:22.511720 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.511830 master-0 kubenswrapper[6932]: I0319 11:58:22.511844 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.512258 master-0 kubenswrapper[6932]: I0319 11:58:22.511871 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dg9r\" (UniqueName: \"kubernetes.io/projected/6870ccc7-2094-48d8-9238-f486a4b8d5af-kube-api-access-9dg9r\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.515047 master-0 kubenswrapper[6932]: I0319 11:58:22.515010 6932 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.515116 master-0 kubenswrapper[6932]: I0319 11:58:22.515055 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.529501 master-0 kubenswrapper[6932]: I0319 11:58:22.529405 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dg9r\" (UniqueName: \"kubernetes.io/projected/6870ccc7-2094-48d8-9238-f486a4b8d5af-kube-api-access-9dg9r\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.648182 master-0 kubenswrapper[6932]: I0319 11:58:22.648123 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:58:22.665496 master-0 kubenswrapper[6932]: W0319 11:58:22.665435 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6870ccc7_2094_48d8_9238_f486a4b8d5af.slice/crio-d2ecc9d2456937963c5f6bc8147a2bbe973205b8a12f3d89082b348a330ba2e2 WatchSource:0}: Error finding container d2ecc9d2456937963c5f6bc8147a2bbe973205b8a12f3d89082b348a330ba2e2: Status 404 returned error can't find the container with id d2ecc9d2456937963c5f6bc8147a2bbe973205b8a12f3d89082b348a330ba2e2 Mar 19 11:58:22.680668 master-0 kubenswrapper[6932]: I0319 11:58:22.680587 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-gqd94_732989c5-1b89-46f0-9917-b68613f7f005/authentication-operator/1.log" Mar 19 11:58:23.076324 master-0 kubenswrapper[6932]: I0319 11:58:23.076176 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-899bc59d8-xxr9r_e45616db-f7dd-4a08-847f-abf2759d9fa4/fix-audit-permissions/0.log" Mar 19 11:58:23.292814 master-0 kubenswrapper[6932]: I0319 11:58:23.292716 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-899bc59d8-xxr9r_e45616db-f7dd-4a08-847f-abf2759d9fa4/oauth-apiserver/0.log" Mar 19 11:58:23.332067 master-0 kubenswrapper[6932]: I0319 11:58:23.331893 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" event={"ID":"e2ad29ad-70ef-43c6-91f6-02f04d145673","Type":"ContainerStarted","Data":"0ac0754966059812979329159fb7716f54973438d4fc6c9efcd8d090ab1b57ef"} Mar 19 11:58:23.334044 master-0 kubenswrapper[6932]: I0319 11:58:23.333935 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" 
event={"ID":"9778f8f5-b0d1-4967-9776-9db758bba3af","Type":"ContainerStarted","Data":"4e1d59b1d5aa7c274c7737d0cc804d01a87bae208b2aa16354b6abb8b4018154"} Mar 19 11:58:23.334044 master-0 kubenswrapper[6932]: I0319 11:58:23.334008 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:58:23.336426 master-0 kubenswrapper[6932]: I0319 11:58:23.336374 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ltk8s" event={"ID":"6870ccc7-2094-48d8-9238-f486a4b8d5af","Type":"ContainerStarted","Data":"b88bf8999b2145a9574329dec3b7634c0545b155bea578287f201879edb8c912"} Mar 19 11:58:23.336426 master-0 kubenswrapper[6932]: I0319 11:58:23.336419 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ltk8s" event={"ID":"6870ccc7-2094-48d8-9238-f486a4b8d5af","Type":"ContainerStarted","Data":"d2ecc9d2456937963c5f6bc8147a2bbe973205b8a12f3d89082b348a330ba2e2"} Mar 19 11:58:23.345122 master-0 kubenswrapper[6932]: I0319 11:58:23.345060 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:58:23.358908 master-0 kubenswrapper[6932]: I0319 11:58:23.358782 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podStartSLOduration=276.658436212 podStartE2EDuration="4m39.358767311s" podCreationTimestamp="2026-03-19 11:53:44 +0000 UTC" firstStartedPulling="2026-03-19 11:58:19.491317372 +0000 UTC m=+323.850377594" lastFinishedPulling="2026-03-19 11:58:22.191648451 +0000 UTC m=+326.550708693" observedRunningTime="2026-03-19 11:58:23.357070999 +0000 UTC m=+327.716131251" watchObservedRunningTime="2026-03-19 11:58:23.358767311 +0000 UTC m=+327.717827533" Mar 19 11:58:23.380624 master-0 
kubenswrapper[6932]: I0319 11:58:23.380554 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-ltk8s" podStartSLOduration=1.380540519 podStartE2EDuration="1.380540519s" podCreationTimestamp="2026-03-19 11:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:23.375169718 +0000 UTC m=+327.734229930" watchObservedRunningTime="2026-03-19 11:58:23.380540519 +0000 UTC m=+327.739600741" Mar 19 11:58:23.404785 master-0 kubenswrapper[6932]: I0319 11:58:23.396804 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" podStartSLOduration=254.10475813 podStartE2EDuration="4m16.396786654s" podCreationTimestamp="2026-03-19 11:54:07 +0000 UTC" firstStartedPulling="2026-03-19 11:58:19.90057716 +0000 UTC m=+324.259637382" lastFinishedPulling="2026-03-19 11:58:22.192605674 +0000 UTC m=+326.551665906" observedRunningTime="2026-03-19 11:58:23.394831216 +0000 UTC m=+327.753891448" watchObservedRunningTime="2026-03-19 11:58:23.396786654 +0000 UTC m=+327.755846876" Mar 19 11:58:23.470191 master-0 kubenswrapper[6932]: I0319 11:58:23.470078 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:58:23.472340 master-0 kubenswrapper[6932]: I0319 11:58:23.472276 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:23.472340 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:23.472340 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:23.472340 master-0 kubenswrapper[6932]: 
healthz check failed
Mar 19 11:58:23.472650 master-0 kubenswrapper[6932]: I0319 11:58:23.472347 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:23.480892 master-0 kubenswrapper[6932]: I0319 11:58:23.480341 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-9w7hc_8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/etcd-operator/0.log"
Mar 19 11:58:23.678045 master-0 kubenswrapper[6932]: I0319 11:58:23.677900 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-9w7hc_8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/etcd-operator/1.log"
Mar 19 11:58:23.711092 master-0 kubenswrapper[6932]: I0319 11:58:23.711010 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 19 11:58:23.712112 master-0 kubenswrapper[6932]: I0319 11:58:23.712084 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.720426 master-0 kubenswrapper[6932]: I0319 11:58:23.720372 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 19 11:58:23.831193 master-0 kubenswrapper[6932]: I0319 11:58:23.831131 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.831193 master-0 kubenswrapper[6932]: I0319 11:58:23.831199 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.831509 master-0 kubenswrapper[6932]: I0319 11:58:23.831242 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.880658 master-0 kubenswrapper[6932]: I0319 11:58:23.880608 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_6bde080b-3820-463f-a27d-9fb9a7843d5d/installer/0.log"
Mar 19 11:58:23.932953 master-0 kubenswrapper[6932]: I0319 11:58:23.932791 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.932953 master-0 kubenswrapper[6932]: I0319 11:58:23.932860 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.932953 master-0 kubenswrapper[6932]: I0319 11:58:23.932924 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.933220 master-0 kubenswrapper[6932]: I0319 11:58:23.932958 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.933442 master-0 kubenswrapper[6932]: I0319 11:58:23.933402 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:23.949103 master-0 kubenswrapper[6932]: I0319 11:58:23.949064 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:24.043708 master-0 kubenswrapper[6932]: I0319 11:58:24.043616 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"]
Mar 19 11:58:24.044896 master-0 kubenswrapper[6932]: I0319 11:58:24.044868 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.047339 master-0 kubenswrapper[6932]: I0319 11:58:24.047295 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 11:58:24.050673 master-0 kubenswrapper[6932]: I0319 11:58:24.050639 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8skrb"
Mar 19 11:58:24.050786 master-0 kubenswrapper[6932]: I0319 11:58:24.050647 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 11:58:24.050786 master-0 kubenswrapper[6932]: I0319 11:58:24.050711 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 11:58:24.057201 master-0 kubenswrapper[6932]: I0319 11:58:24.057159 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"]
Mar 19 11:58:24.062515 master-0 kubenswrapper[6932]: I0319 11:58:24.062455 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:58:24.100252 master-0 kubenswrapper[6932]: I0319 11:58:24.100157 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-pjc7h_39d3ac31-9259-454b-8e1c-e23024f8f2b2/kube-apiserver-operator/0.log"
Mar 19 11:58:24.139632 master-0 kubenswrapper[6932]: I0319 11:58:24.139558 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zndqq\" (UniqueName: \"kubernetes.io/projected/f4aad0ff-e6cd-4c43-9561-80a14fee4712-kube-api-access-zndqq\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.139632 master-0 kubenswrapper[6932]: I0319 11:58:24.139621 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.139925 master-0 kubenswrapper[6932]: I0319 11:58:24.139650 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.139925 master-0 kubenswrapper[6932]: I0319 11:58:24.139700 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.241044 master-0 kubenswrapper[6932]: I0319 11:58:24.240977 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zndqq\" (UniqueName: \"kubernetes.io/projected/f4aad0ff-e6cd-4c43-9561-80a14fee4712-kube-api-access-zndqq\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.241044 master-0 kubenswrapper[6932]: I0319 11:58:24.241043 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.241285 master-0 kubenswrapper[6932]: I0319 11:58:24.241250 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.241383 master-0 kubenswrapper[6932]: I0319 11:58:24.241350 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.243774 master-0 kubenswrapper[6932]: E0319 11:58:24.241624 6932 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Mar 19 11:58:24.243774 master-0 kubenswrapper[6932]: E0319 11:58:24.241772 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls podName:f4aad0ff-e6cd-4c43-9561-80a14fee4712 nodeName:}" failed. No retries permitted until 2026-03-19 11:58:24.741683019 +0000 UTC m=+329.100743241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls") pod "prometheus-operator-6c8df6d4b-xfwkr" (UID: "f4aad0ff-e6cd-4c43-9561-80a14fee4712") : secret "prometheus-operator-tls" not found
Mar 19 11:58:24.243774 master-0 kubenswrapper[6932]: I0319 11:58:24.242290 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.247806 master-0 kubenswrapper[6932]: I0319 11:58:24.245932 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.306783 master-0 kubenswrapper[6932]: I0319 11:58:24.306604 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zndqq\" (UniqueName: \"kubernetes.io/projected/f4aad0ff-e6cd-4c43-9561-80a14fee4712-kube-api-access-zndqq\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.320576 master-0 kubenswrapper[6932]: I0319 11:58:24.320198 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-pjc7h_39d3ac31-9259-454b-8e1c-e23024f8f2b2/kube-apiserver-operator/1.log"
Mar 19 11:58:24.471815 master-0 kubenswrapper[6932]: I0319 11:58:24.471676 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:24.471815 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:24.471815 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:24.471815 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:24.471815 master-0 kubenswrapper[6932]: I0319 11:58:24.471762 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:24.480117 master-0 kubenswrapper[6932]: I0319 11:58:24.480057 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/setup/0.log"
Mar 19 11:58:24.563228 master-0 kubenswrapper[6932]: I0319 11:58:24.563142 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 19 11:58:24.683111 master-0 kubenswrapper[6932]: I0319 11:58:24.683057 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/kube-apiserver/0.log"
Mar 19 11:58:24.748273 master-0 kubenswrapper[6932]: I0319 11:58:24.748136 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.751981 master-0 kubenswrapper[6932]: I0319 11:58:24.751951 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:24.874836 master-0 kubenswrapper[6932]: I0319 11:58:24.874790 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/kube-apiserver-insecure-readyz/0.log"
Mar 19 11:58:24.961099 master-0 kubenswrapper[6932]: I0319 11:58:24.961036 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:58:25.082506 master-0 kubenswrapper[6932]: I0319 11:58:25.082451 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_8e508a43-99db-49eb-bf4e-e3e6a0f49761/installer/0.log"
Mar 19 11:58:25.279475 master-0 kubenswrapper[6932]: I0319 11:58:25.279435 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_02e4c691-68ed-49f6-a8f6-c87579b65f07/installer/0.log"
Mar 19 11:58:25.364791 master-0 kubenswrapper[6932]: I0319 11:58:25.364654 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"1c576a88-6da4-43e9-a373-0df27a029f59","Type":"ContainerStarted","Data":"ddc94e7a85827e965bf13353b20a1293018b59883ea4cdbc55de2c9639ca8732"}
Mar 19 11:58:25.364791 master-0 kubenswrapper[6932]: I0319 11:58:25.364707 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"1c576a88-6da4-43e9-a373-0df27a029f59","Type":"ContainerStarted","Data":"a6afce628b759b4a9bfac575d71074779271063662545462b264e568ed7ab2d8"}
Mar 19 11:58:25.375405 master-0 kubenswrapper[6932]: I0319 11:58:25.375353 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"]
Mar 19 11:58:25.383118 master-0 kubenswrapper[6932]: W0319 11:58:25.383062 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4aad0ff_e6cd_4c43_9561_80a14fee4712.slice/crio-a00871e023b42142484ad987a4c2956151fb53dc58e2ab128b59501bf258f39e WatchSource:0}: Error finding container a00871e023b42142484ad987a4c2956151fb53dc58e2ab128b59501bf258f39e: Status 404 returned error can't find the container with id a00871e023b42142484ad987a4c2956151fb53dc58e2ab128b59501bf258f39e
Mar 19 11:58:25.390311 master-0 kubenswrapper[6932]: I0319 11:58:25.388652 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.388629808 podStartE2EDuration="2.388629808s" podCreationTimestamp="2026-03-19 11:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:25.386064517 +0000 UTC m=+329.745124749" watchObservedRunningTime="2026-03-19 11:58:25.388629808 +0000 UTC m=+329.747690030"
Mar 19 11:58:25.472385 master-0 kubenswrapper[6932]: I0319 11:58:25.472306 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:25.472385 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:25.472385 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:25.472385 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:25.473251 master-0 kubenswrapper[6932]: I0319 11:58:25.472407 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:25.479524 master-0 kubenswrapper[6932]: I0319 11:58:25.479462 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-5gvgh_dbcbba74-ac53-4724-a217-4d9b85e7c1db/kube-controller-manager-operator/0.log"
Mar 19 11:58:25.680230 master-0 kubenswrapper[6932]: I0319 11:58:25.680098 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-5gvgh_dbcbba74-ac53-4724-a217-4d9b85e7c1db/kube-controller-manager-operator/1.log"
Mar 19 11:58:25.880668 master-0 kubenswrapper[6932]: I0319 11:58:25.880622 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_46f265536aba6292ead501bc9b49f327/kube-controller-manager/2.log"
Mar 19 11:58:26.278831 master-0 kubenswrapper[6932]: I0319 11:58:26.278784 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_46f265536aba6292ead501bc9b49f327/kube-controller-manager/3.log"
Mar 19 11:58:26.380971 master-0 kubenswrapper[6932]: I0319 11:58:26.380851 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" event={"ID":"f4aad0ff-e6cd-4c43-9561-80a14fee4712","Type":"ContainerStarted","Data":"a00871e023b42142484ad987a4c2956151fb53dc58e2ab128b59501bf258f39e"}
Mar 19 11:58:26.471834 master-0 kubenswrapper[6932]: I0319 11:58:26.471787 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:26.471834 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:26.471834 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:26.471834 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:26.472231 master-0 kubenswrapper[6932]: I0319 11:58:26.471855 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:26.543418 master-0 kubenswrapper[6932]: I0319 11:58:26.543093 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_46f265536aba6292ead501bc9b49f327/cluster-policy-controller/0.log"
Mar 19 11:58:27.472187 master-0 kubenswrapper[6932]: I0319 11:58:27.472106 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:27.472187 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:27.472187 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:27.472187 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:27.472617 master-0 kubenswrapper[6932]: I0319 11:58:27.472187 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:27.589194 master-0 kubenswrapper[6932]: I0319 11:58:27.585637 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_c83737980b9ee109184b1d78e942cf36/kube-scheduler/0.log"
Mar 19 11:58:28.472040 master-0 kubenswrapper[6932]: I0319 11:58:28.471950 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:28.472040 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:28.472040 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:28.472040 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:28.472040 master-0 kubenswrapper[6932]: I0319 11:58:28.472023 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:28.974288 master-0 kubenswrapper[6932]: I0319 11:58:28.974220 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_c83737980b9ee109184b1d78e942cf36/kube-scheduler/1.log"
Mar 19 11:58:29.043222 master-0 kubenswrapper[6932]: I0319 11:58:29.043144 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:58:29.087690 master-0 kubenswrapper[6932]: I0319 11:58:29.087627 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w2fqh"
Mar 19 11:58:29.469229 master-0 kubenswrapper[6932]: I0319 11:58:29.469147 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt"
Mar 19 11:58:29.471683 master-0 kubenswrapper[6932]: I0319 11:58:29.471641 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:29.471683 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:29.471683 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:29.471683 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:29.471865 master-0 kubenswrapper[6932]: I0319 11:58:29.471819 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:29.854426 master-0 kubenswrapper[6932]: E0319 11:58:29.854263 6932 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-conmon-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache]"
Mar 19 11:58:30.474917 master-0 kubenswrapper[6932]: I0319 11:58:30.474843 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:30.474917 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:30.474917 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:30.474917 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:30.474917 master-0 kubenswrapper[6932]: I0319 11:58:30.474917 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:31.137479 master-0 kubenswrapper[6932]: I0319 11:58:31.137400 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_870e66ff-82ed-4c91-8197-dddcb78048c2/installer/0.log"
Mar 19 11:58:31.177978 master-0 kubenswrapper[6932]: I0319 11:58:31.177928 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-4wj9n_9b61ea14-a7ea-49f3-9df4-5655765ddf7c/kube-scheduler-operator-container/0.log"
Mar 19 11:58:31.239527 master-0 kubenswrapper[6932]: I0319 11:58:31.239437 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-4wj9n_9b61ea14-a7ea-49f3-9df4-5655765ddf7c/kube-scheduler-operator-container/1.log"
Mar 19 11:58:31.269470 master-0 kubenswrapper[6932]: I0319 11:58:31.269431 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-6hsqn_66f88242-8b0b-4790-bbb6-445c19b34ee7/openshift-apiserver-operator/0.log"
Mar 19 11:58:31.318789 master-0 kubenswrapper[6932]: I0319 11:58:31.316600 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-6hsqn_66f88242-8b0b-4790-bbb6-445c19b34ee7/openshift-apiserver-operator/1.log"
Mar 19 11:58:31.334098 master-0 kubenswrapper[6932]: I0319 11:58:31.331637 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-f67f6868b-chx8j_e48b5aa9-293e-4222-91ff-7640addeca4c/fix-audit-permissions/0.log"
Mar 19 11:58:31.365888 master-0 kubenswrapper[6932]: I0319 11:58:31.362376 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-f67f6868b-chx8j_e48b5aa9-293e-4222-91ff-7640addeca4c/openshift-apiserver/0.log"
Mar 19 11:58:31.372851 master-0 kubenswrapper[6932]: I0319 11:58:31.372052 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-f67f6868b-chx8j_e48b5aa9-293e-4222-91ff-7640addeca4c/openshift-apiserver-check-endpoints/0.log"
Mar 19 11:58:31.378711 master-0 kubenswrapper[6932]: I0319 11:58:31.378669 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-9w7hc_8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/etcd-operator/0.log"
Mar 19 11:58:31.385373 master-0 kubenswrapper[6932]: I0319 11:58:31.385322 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-9w7hc_8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/etcd-operator/1.log"
Mar 19 11:58:31.398180 master-0 kubenswrapper[6932]: I0319 11:58:31.397655 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-n5gr9_cf08ab4f-c203-4c16-9826-8cc049f4af31/catalog-operator/0.log"
Mar 19 11:58:31.410843 master-0 kubenswrapper[6932]: I0319 11:58:31.410604 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5c9796789-l9sw9_716c2176-50f9-4c4f-af0e-4c7973457df2/olm-operator/0.log"
Mar 19 11:58:31.472941 master-0 kubenswrapper[6932]: I0319 11:58:31.472844 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:31.472941 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:31.472941 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:31.472941 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:31.473249 master-0 kubenswrapper[6932]: I0319 11:58:31.472966 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:31.547229 master-0 kubenswrapper[6932]: I0319 11:58:31.547193 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-jq5vq_e5078f17-bc65-460f-9f18-8c506db6840b/kube-rbac-proxy/0.log"
Mar 19 11:58:31.764277 master-0 kubenswrapper[6932]: I0319 11:58:31.764240 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-jq5vq_e5078f17-bc65-460f-9f18-8c506db6840b/package-server-manager/0.log"
Mar 19 11:58:31.951791 master-0 kubenswrapper[6932]: I0319 11:58:31.951656 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-bbf67c86c-n58nq_6d41245b-33d4-40f8-bbe1-6d2247e2e335/packageserver/0.log"
Mar 19 11:58:32.432432 master-0 kubenswrapper[6932]: I0319 11:58:32.432294 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" event={"ID":"f4aad0ff-e6cd-4c43-9561-80a14fee4712","Type":"ContainerStarted","Data":"4998a7ab15e3384ea86c4520bf8098b7c21484fd66d50792e63ee1727f98126d"}
Mar 19 11:58:32.432432 master-0 kubenswrapper[6932]: I0319 11:58:32.432343 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" event={"ID":"f4aad0ff-e6cd-4c43-9561-80a14fee4712","Type":"ContainerStarted","Data":"15ec53301686dbd164c9bed50c2fa4de455e2d5d72eeae4605a336118cb62f4b"}
Mar 19 11:58:32.471692 master-0 kubenswrapper[6932]: I0319 11:58:32.471612 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:32.471692 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:32.471692 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:32.471692 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:32.471692 master-0 kubenswrapper[6932]: I0319 11:58:32.471692 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:32.541616 master-0 kubenswrapper[6932]: I0319 11:58:32.541500 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" podStartSLOduration=2.660092231 podStartE2EDuration="8.54146176s" podCreationTimestamp="2026-03-19 11:58:24 +0000 UTC" firstStartedPulling="2026-03-19 11:58:25.391019677 +0000 UTC m=+329.750079899" lastFinishedPulling="2026-03-19 11:58:31.272389206 +0000 UTC m=+335.631449428" observedRunningTime="2026-03-19 11:58:32.540117328 +0000 UTC m=+336.899177560" watchObservedRunningTime="2026-03-19 11:58:32.54146176 +0000 UTC m=+336.900521982"
Mar 19 11:58:33.475185 master-0 kubenswrapper[6932]: I0319 11:58:33.472041 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:33.475185 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:33.475185 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:33.475185 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:33.475185 master-0 kubenswrapper[6932]: I0319 11:58:33.472105 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:34.465199 master-0 kubenswrapper[6932]: I0319 11:58:34.465121 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn"]
Mar 19 11:58:34.466397 master-0 kubenswrapper[6932]: I0319 11:58:34.466362 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn"
Mar 19 11:58:34.471092 master-0 kubenswrapper[6932]: I0319 11:58:34.471027 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:34.471092 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:34.471092 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:34.471092 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:34.471406 master-0 kubenswrapper[6932]: I0319 11:58:34.471099 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:34.475973 master-0 kubenswrapper[6932]: I0319 11:58:34.475926 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 11:58:34.476610 master-0 kubenswrapper[6932]: I0319 11:58:34.476581 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 11:58:34.476784 master-0 kubenswrapper[6932]: I0319 11:58:34.476763 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-68jgh"
Mar 19 11:58:34.490405 master-0 kubenswrapper[6932]: I0319 11:58:34.490336 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn"]
Mar
19 11:58:34.513310 master-0 kubenswrapper[6932]: I0319 11:58:34.513240 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pnb9m"] Mar 19 11:58:34.514498 master-0 kubenswrapper[6932]: I0319 11:58:34.514466 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.516489 master-0 kubenswrapper[6932]: I0319 11:58:34.516464 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-strbt" Mar 19 11:58:34.516644 master-0 kubenswrapper[6932]: I0319 11:58:34.516621 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 19 11:58:34.516793 master-0 kubenswrapper[6932]: I0319 11:58:34.516772 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 11:58:34.545776 master-0 kubenswrapper[6932]: I0319 11:58:34.545702 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f"] Mar 19 11:58:34.547295 master-0 kubenswrapper[6932]: I0319 11:58:34.547266 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.551959 master-0 kubenswrapper[6932]: I0319 11:58:34.551913 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 19 11:58:34.552089 master-0 kubenswrapper[6932]: I0319 11:58:34.552052 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 19 11:58:34.552148 master-0 kubenswrapper[6932]: I0319 11:58:34.552079 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 19 11:58:34.552194 master-0 kubenswrapper[6932]: I0319 11:58:34.552174 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-zf4zz" Mar 19 11:58:34.589870 master-0 kubenswrapper[6932]: I0319 11:58:34.583379 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f"] Mar 19 11:58:34.625801 master-0 kubenswrapper[6932]: I0319 11:58:34.625650 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.625801 master-0 kubenswrapper[6932]: I0319 11:58:34.625740 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: 
\"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.625801 master-0 kubenswrapper[6932]: I0319 11:58:34.625772 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.625801 master-0 kubenswrapper[6932]: I0319 11:58:34.625804 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.625832 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.625854 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-wtmp\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.625873 
6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.625895 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.625940 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-sys\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.625964 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bbtl\" (UniqueName: \"kubernetes.io/projected/d06b230b-db67-4afc-8d10-2c33ad568462-kube-api-access-4bbtl\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.625983 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-textfile\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " 
pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.626002 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrd5\" (UniqueName: \"kubernetes.io/projected/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-api-access-kwrd5\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.626058 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.626082 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.626100 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-root\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.626118 6932 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.626136 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lscpq\" (UniqueName: \"kubernetes.io/projected/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-kube-api-access-lscpq\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.626234 master-0 kubenswrapper[6932]: I0319 11:58:34.626154 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.728179 master-0 kubenswrapper[6932]: I0319 11:58:34.727942 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.728469 master-0 kubenswrapper[6932]: I0319 11:58:34.728238 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca\") pod 
\"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.728469 master-0 kubenswrapper[6932]: I0319 11:58:34.728315 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-sys\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.728469 master-0 kubenswrapper[6932]: I0319 11:58:34.728416 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-sys\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.728570 master-0 kubenswrapper[6932]: I0319 11:58:34.728513 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bbtl\" (UniqueName: \"kubernetes.io/projected/d06b230b-db67-4afc-8d10-2c33ad568462-kube-api-access-4bbtl\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.728650 master-0 kubenswrapper[6932]: I0319 11:58:34.728610 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-textfile\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.728703 master-0 kubenswrapper[6932]: I0319 11:58:34.728657 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrd5\" (UniqueName: \"kubernetes.io/projected/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-api-access-kwrd5\") pod 
\"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.728893 master-0 kubenswrapper[6932]: I0319 11:58:34.728827 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.728893 master-0 kubenswrapper[6932]: I0319 11:58:34.728874 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.728979 master-0 kubenswrapper[6932]: I0319 11:58:34.728898 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-root\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.728979 master-0 kubenswrapper[6932]: I0319 11:58:34.728930 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.728979 master-0 kubenswrapper[6932]: I0319 11:58:34.728953 6932 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.728979 master-0 kubenswrapper[6932]: I0319 11:58:34.728979 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lscpq\" (UniqueName: \"kubernetes.io/projected/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-kube-api-access-lscpq\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.729149 master-0 kubenswrapper[6932]: I0319 11:58:34.729008 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.729149 master-0 kubenswrapper[6932]: I0319 11:58:34.729045 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.729149 master-0 kubenswrapper[6932]: I0319 11:58:34.729091 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.729149 master-0 kubenswrapper[6932]: I0319 11:58:34.729112 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.729149 master-0 kubenswrapper[6932]: I0319 11:58:34.729145 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.729304 master-0 kubenswrapper[6932]: I0319 11:58:34.729165 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-wtmp\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.729338 master-0 kubenswrapper[6932]: I0319 11:58:34.729291 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.729431 master-0 
kubenswrapper[6932]: I0319 11:58:34.729405 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-wtmp\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.729490 master-0 kubenswrapper[6932]: E0319 11:58:34.729460 6932 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 19 11:58:34.729556 master-0 kubenswrapper[6932]: E0319 11:58:34.729535 6932 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls podName:d06b230b-db67-4afc-8d10-2c33ad568462 nodeName:}" failed. No retries permitted until 2026-03-19 11:58:35.22951441 +0000 UTC m=+339.588574632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls") pod "node-exporter-pnb9m" (UID: "d06b230b-db67-4afc-8d10-2c33ad568462") : secret "node-exporter-tls" not found Mar 19 11:58:34.729845 master-0 kubenswrapper[6932]: I0319 11:58:34.729815 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.730717 master-0 kubenswrapper[6932]: I0319 11:58:34.730659 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-root\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " 
pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.732788 master-0 kubenswrapper[6932]: I0319 11:58:34.731131 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.732788 master-0 kubenswrapper[6932]: I0319 11:58:34.731328 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-textfile\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.732788 master-0 kubenswrapper[6932]: I0319 11:58:34.732091 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.733175 master-0 kubenswrapper[6932]: I0319 11:58:34.732800 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.733985 master-0 kubenswrapper[6932]: I0319 11:58:34.733870 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.735805 master-0 kubenswrapper[6932]: I0319 11:58:34.734387 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.741497 master-0 kubenswrapper[6932]: I0319 11:58:34.740248 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.741863 master-0 kubenswrapper[6932]: I0319 11:58:34.741556 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.745375 master-0 kubenswrapper[6932]: I0319 11:58:34.745311 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") 
" pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.754537 master-0 kubenswrapper[6932]: I0319 11:58:34.754418 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bbtl\" (UniqueName: \"kubernetes.io/projected/d06b230b-db67-4afc-8d10-2c33ad568462-kube-api-access-4bbtl\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:34.762162 master-0 kubenswrapper[6932]: I0319 11:58:34.762045 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrd5\" (UniqueName: \"kubernetes.io/projected/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-api-access-kwrd5\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:34.767835 master-0 kubenswrapper[6932]: I0319 11:58:34.766480 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lscpq\" (UniqueName: \"kubernetes.io/projected/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-kube-api-access-lscpq\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.786692 master-0 kubenswrapper[6932]: I0319 11:58:34.786575 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:58:34.876349 master-0 kubenswrapper[6932]: I0319 11:58:34.876259 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:58:35.167408 master-0 kubenswrapper[6932]: I0319 11:58:35.162708 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn"] Mar 19 11:58:35.256804 master-0 kubenswrapper[6932]: I0319 11:58:35.256715 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:35.261977 master-0 kubenswrapper[6932]: I0319 11:58:35.261877 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:35.447226 master-0 kubenswrapper[6932]: I0319 11:58:35.446016 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:58:35.452304 master-0 kubenswrapper[6932]: I0319 11:58:35.452230 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" event={"ID":"dedf55c4-eeda-4955-aafe-db1fdb8c4a48","Type":"ContainerStarted","Data":"1d78352cc0165381add58d3c3353316e1f87d336df48d27209cc869230009c97"} Mar 19 11:58:35.452304 master-0 kubenswrapper[6932]: I0319 11:58:35.452291 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" event={"ID":"dedf55c4-eeda-4955-aafe-db1fdb8c4a48","Type":"ContainerStarted","Data":"8ce4d1f32bb0cc2bd6719ecbc1bb660798af73ec1a021eb215e32bb686d9ba1b"} Mar 19 11:58:35.458220 master-0 kubenswrapper[6932]: I0319 11:58:35.458114 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f"] Mar 19 11:58:35.467135 master-0 kubenswrapper[6932]: W0319 11:58:35.467079 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d63d5a8_f45d_4678_824d_5534b2bcd6ca.slice/crio-17a53907f75f6dae7caa627268daa345a6154ff885830dae9a1873ed761e0552 WatchSource:0}: Error finding container 17a53907f75f6dae7caa627268daa345a6154ff885830dae9a1873ed761e0552: Status 404 returned error can't find the container with id 17a53907f75f6dae7caa627268daa345a6154ff885830dae9a1873ed761e0552 Mar 19 11:58:35.473237 master-0 kubenswrapper[6932]: I0319 11:58:35.472036 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:35.473237 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:35.473237 master-0 kubenswrapper[6932]: 
[+]process-running ok Mar 19 11:58:35.473237 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:35.473237 master-0 kubenswrapper[6932]: I0319 11:58:35.472083 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:35.474918 master-0 kubenswrapper[6932]: W0319 11:58:35.474878 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd06b230b_db67_4afc_8d10_2c33ad568462.slice/crio-6956b1980f4b04ce367cfbad3aeea7396b54e1517e031f7afdbbd760960fd241 WatchSource:0}: Error finding container 6956b1980f4b04ce367cfbad3aeea7396b54e1517e031f7afdbbd760960fd241: Status 404 returned error can't find the container with id 6956b1980f4b04ce367cfbad3aeea7396b54e1517e031f7afdbbd760960fd241 Mar 19 11:58:36.471447 master-0 kubenswrapper[6932]: I0319 11:58:36.471354 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:36.471447 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:36.471447 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:36.471447 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:36.471447 master-0 kubenswrapper[6932]: I0319 11:58:36.471414 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:36.474922 master-0 kubenswrapper[6932]: I0319 11:58:36.474797 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-pnb9m" event={"ID":"d06b230b-db67-4afc-8d10-2c33ad568462","Type":"ContainerStarted","Data":"6956b1980f4b04ce367cfbad3aeea7396b54e1517e031f7afdbbd760960fd241"} Mar 19 11:58:36.477296 master-0 kubenswrapper[6932]: I0319 11:58:36.477266 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" event={"ID":"2d63d5a8-f45d-4678-824d-5534b2bcd6ca","Type":"ContainerStarted","Data":"17a53907f75f6dae7caa627268daa345a6154ff885830dae9a1873ed761e0552"} Mar 19 11:58:36.479750 master-0 kubenswrapper[6932]: I0319 11:58:36.479690 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" event={"ID":"dedf55c4-eeda-4955-aafe-db1fdb8c4a48","Type":"ContainerStarted","Data":"3e8a4d46cdb3eb84efa71b944a67a65a4d2d18a18f2d60b24fd7f5ac87379bea"} Mar 19 11:58:37.471592 master-0 kubenswrapper[6932]: I0319 11:58:37.471200 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:37.471592 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:37.471592 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:37.471592 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:37.471592 master-0 kubenswrapper[6932]: I0319 11:58:37.471280 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:37.488453 master-0 kubenswrapper[6932]: I0319 11:58:37.488363 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pnb9m" 
event={"ID":"d06b230b-db67-4afc-8d10-2c33ad568462","Type":"ContainerStarted","Data":"1593c64a217270ac3de7b41e76b88277976a5cada758c58a75da6710a40d48b7"} Mar 19 11:58:37.492707 master-0 kubenswrapper[6932]: I0319 11:58:37.492672 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" event={"ID":"dedf55c4-eeda-4955-aafe-db1fdb8c4a48","Type":"ContainerStarted","Data":"e11c663a8c4c5ed3ee534e8439d1f4cc0761648feed77fbdff9ca96485e30b0c"} Mar 19 11:58:37.601015 master-0 kubenswrapper[6932]: I0319 11:58:37.600904 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" podStartSLOduration=1.9333819330000002 podStartE2EDuration="3.600871772s" podCreationTimestamp="2026-03-19 11:58:34 +0000 UTC" firstStartedPulling="2026-03-19 11:58:35.484551064 +0000 UTC m=+339.843611286" lastFinishedPulling="2026-03-19 11:58:37.152040913 +0000 UTC m=+341.511101125" observedRunningTime="2026-03-19 11:58:37.596402493 +0000 UTC m=+341.955462725" watchObservedRunningTime="2026-03-19 11:58:37.600871772 +0000 UTC m=+341.959931994" Mar 19 11:58:38.472588 master-0 kubenswrapper[6932]: I0319 11:58:38.472223 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:38.472588 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:38.472588 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:38.472588 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:38.472588 master-0 kubenswrapper[6932]: I0319 11:58:38.472296 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:38.506944 master-0 kubenswrapper[6932]: I0319 11:58:38.506854 6932 generic.go:334] "Generic (PLEG): container finished" podID="d06b230b-db67-4afc-8d10-2c33ad568462" containerID="1593c64a217270ac3de7b41e76b88277976a5cada758c58a75da6710a40d48b7" exitCode=0 Mar 19 11:58:38.507190 master-0 kubenswrapper[6932]: I0319 11:58:38.507142 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pnb9m" event={"ID":"d06b230b-db67-4afc-8d10-2c33ad568462","Type":"ContainerDied","Data":"1593c64a217270ac3de7b41e76b88277976a5cada758c58a75da6710a40d48b7"} Mar 19 11:58:39.471946 master-0 kubenswrapper[6932]: I0319 11:58:39.471840 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:39.471946 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:39.471946 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:39.471946 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:39.472346 master-0 kubenswrapper[6932]: I0319 11:58:39.471982 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:39.516468 master-0 kubenswrapper[6932]: I0319 11:58:39.516296 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" event={"ID":"2d63d5a8-f45d-4678-824d-5534b2bcd6ca","Type":"ContainerStarted","Data":"bc462c03b7ec1ee932de17e7e0c676bfd7eae714df0816cf7e7754b23c54b641"} Mar 19 11:58:39.516468 master-0 kubenswrapper[6932]: I0319 11:58:39.516371 6932 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" event={"ID":"2d63d5a8-f45d-4678-824d-5534b2bcd6ca","Type":"ContainerStarted","Data":"a3bdbddbc269d1bab289a37778c24f74a6a39d199ddc73eacfce412e10284cd7"} Mar 19 11:58:39.516468 master-0 kubenswrapper[6932]: I0319 11:58:39.516391 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" event={"ID":"2d63d5a8-f45d-4678-824d-5534b2bcd6ca","Type":"ContainerStarted","Data":"cb57e2e6e049c7ff4e3925b322386bddc016ac19a8d87423f5d7fadba2e897c7"} Mar 19 11:58:39.519061 master-0 kubenswrapper[6932]: I0319 11:58:39.519027 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pnb9m" event={"ID":"d06b230b-db67-4afc-8d10-2c33ad568462","Type":"ContainerStarted","Data":"f8c3cc294465a47f1b0ecb41058bdc4d09d0c575910cd08e562d3a9841b4fd8d"} Mar 19 11:58:39.519145 master-0 kubenswrapper[6932]: I0319 11:58:39.519064 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pnb9m" event={"ID":"d06b230b-db67-4afc-8d10-2c33ad568462","Type":"ContainerStarted","Data":"3d9543dac4edd57faf98c3ed5a8a99196b7de65dcd0fbd940db452f032c931d8"} Mar 19 11:58:39.549873 master-0 kubenswrapper[6932]: I0319 11:58:39.549767 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" podStartSLOduration=2.215062401 podStartE2EDuration="5.549745423s" podCreationTimestamp="2026-03-19 11:58:34 +0000 UTC" firstStartedPulling="2026-03-19 11:58:35.471382694 +0000 UTC m=+339.830442916" lastFinishedPulling="2026-03-19 11:58:38.806065716 +0000 UTC m=+343.165125938" observedRunningTime="2026-03-19 11:58:39.547047549 +0000 UTC m=+343.906107791" watchObservedRunningTime="2026-03-19 11:58:39.549745423 +0000 UTC m=+343.908805645" Mar 19 11:58:39.569292 master-0 kubenswrapper[6932]: I0319 11:58:39.569176 6932 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pnb9m" podStartSLOduration=3.898748144 podStartE2EDuration="5.569147934s" podCreationTimestamp="2026-03-19 11:58:34 +0000 UTC" firstStartedPulling="2026-03-19 11:58:35.477524953 +0000 UTC m=+339.836585165" lastFinishedPulling="2026-03-19 11:58:37.147924743 +0000 UTC m=+341.506984955" observedRunningTime="2026-03-19 11:58:39.566287375 +0000 UTC m=+343.925347607" watchObservedRunningTime="2026-03-19 11:58:39.569147934 +0000 UTC m=+343.928208166" Mar 19 11:58:39.909451 master-0 kubenswrapper[6932]: I0319 11:58:39.909273 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"] Mar 19 11:58:39.910384 master-0 kubenswrapper[6932]: I0319 11:58:39.910352 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:39.918346 master-0 kubenswrapper[6932]: I0319 11:58:39.915789 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 19 11:58:39.918346 master-0 kubenswrapper[6932]: I0319 11:58:39.916007 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-w94s8" Mar 19 11:58:39.918346 master-0 kubenswrapper[6932]: I0319 11:58:39.916094 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-6ro5itlgu7nag" Mar 19 11:58:39.918346 master-0 kubenswrapper[6932]: I0319 11:58:39.916293 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 11:58:39.918346 master-0 kubenswrapper[6932]: I0319 11:58:39.916486 6932 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 11:58:39.918346 master-0 kubenswrapper[6932]: I0319 
11:58:39.916652 6932 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 19 11:58:39.920777 master-0 kubenswrapper[6932]: I0319 11:58:39.920318 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"] Mar 19 11:58:39.996051 master-0 kubenswrapper[6932]: E0319 11:58:39.995965 6932 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-conmon-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 11:58:40.073757 master-0 kubenswrapper[6932]: I0319 11:58:40.073589 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.074136 master-0 kubenswrapper[6932]: I0319 11:58:40.073897 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.074226 master-0 kubenswrapper[6932]: I0319 11:58:40.074176 6932 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.074521 master-0 kubenswrapper[6932]: I0319 11:58:40.074469 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.074792 master-0 kubenswrapper[6932]: I0319 11:58:40.074757 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g997b\" (UniqueName: \"kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.074964 master-0 kubenswrapper[6932]: I0319 11:58:40.074935 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.075210 master-0 kubenswrapper[6932]: I0319 11:58:40.075172 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.176841 master-0 kubenswrapper[6932]: I0319 11:58:40.176565 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.176841 master-0 kubenswrapper[6932]: I0319 11:58:40.176682 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.176841 master-0 kubenswrapper[6932]: I0319 11:58:40.176716 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.176841 master-0 kubenswrapper[6932]: I0319 11:58:40.176779 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.176841 
master-0 kubenswrapper[6932]: I0319 11:58:40.176810 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.176841 master-0 kubenswrapper[6932]: I0319 11:58:40.176843 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.177482 master-0 kubenswrapper[6932]: I0319 11:58:40.176873 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g997b\" (UniqueName: \"kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.177992 master-0 kubenswrapper[6932]: I0319 11:58:40.177934 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.178376 master-0 kubenswrapper[6932]: I0319 11:58:40.178311 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") pod 
\"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.178848 master-0 kubenswrapper[6932]: I0319 11:58:40.178779 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.181291 master-0 kubenswrapper[6932]: I0319 11:58:40.181072 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.181705 master-0 kubenswrapper[6932]: I0319 11:58:40.181650 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.191467 master-0 kubenswrapper[6932]: I0319 11:58:40.191423 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.194389 master-0 kubenswrapper[6932]: I0319 11:58:40.194332 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-g997b\" (UniqueName: \"kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.239867 master-0 kubenswrapper[6932]: I0319 11:58:40.239774 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:58:40.476777 master-0 kubenswrapper[6932]: I0319 11:58:40.476686 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:40.476777 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:40.476777 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:40.476777 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:40.477477 master-0 kubenswrapper[6932]: I0319 11:58:40.476806 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:40.710256 master-0 kubenswrapper[6932]: I0319 11:58:40.710186 6932 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"] Mar 19 11:58:40.711362 master-0 kubenswrapper[6932]: W0319 11:58:40.711318 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f8c022c_7871_4765_971f_dcafa39357c9.slice/crio-d26de8d7725dab288840f8eb4631a12a8821676d8fd47b0810577c9ee4f7e3b9 WatchSource:0}: Error finding container d26de8d7725dab288840f8eb4631a12a8821676d8fd47b0810577c9ee4f7e3b9: 
Status 404 returned error can't find the container with id d26de8d7725dab288840f8eb4631a12a8821676d8fd47b0810577c9ee4f7e3b9 Mar 19 11:58:41.472187 master-0 kubenswrapper[6932]: I0319 11:58:41.472115 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:41.472187 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:41.472187 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:41.472187 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:41.472671 master-0 kubenswrapper[6932]: I0319 11:58:41.472627 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:41.535034 master-0 kubenswrapper[6932]: I0319 11:58:41.534977 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" event={"ID":"5f8c022c-7871-4765-971f-dcafa39357c9","Type":"ContainerStarted","Data":"d26de8d7725dab288840f8eb4631a12a8821676d8fd47b0810577c9ee4f7e3b9"} Mar 19 11:58:42.472817 master-0 kubenswrapper[6932]: I0319 11:58:42.472134 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:42.472817 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:42.472817 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:42.472817 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:42.472817 master-0 kubenswrapper[6932]: I0319 11:58:42.472260 6932 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:43.276937 master-0 kubenswrapper[6932]: E0319 11:58:43.276865 6932 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-conmon-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 11:58:43.472314 master-0 kubenswrapper[6932]: I0319 11:58:43.472180 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:43.472314 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:43.472314 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:43.472314 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:43.472800 master-0 kubenswrapper[6932]: I0319 11:58:43.472328 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:43.552882 master-0 kubenswrapper[6932]: I0319 11:58:43.552628 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" 
event={"ID":"5f8c022c-7871-4765-971f-dcafa39357c9","Type":"ContainerStarted","Data":"72a73422baa1bf839575e34cbe90d73e29ac03ab1786e2499f59601d503649f6"} Mar 19 11:58:43.580045 master-0 kubenswrapper[6932]: I0319 11:58:43.579902 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" podStartSLOduration=2.709419125 podStartE2EDuration="4.579710257s" podCreationTimestamp="2026-03-19 11:58:39 +0000 UTC" firstStartedPulling="2026-03-19 11:58:40.71426315 +0000 UTC m=+345.073323362" lastFinishedPulling="2026-03-19 11:58:42.584554272 +0000 UTC m=+346.943614494" observedRunningTime="2026-03-19 11:58:43.578341363 +0000 UTC m=+347.937401615" watchObservedRunningTime="2026-03-19 11:58:43.579710257 +0000 UTC m=+347.938770519" Mar 19 11:58:44.473153 master-0 kubenswrapper[6932]: I0319 11:58:44.473044 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:58:44.473153 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:58:44.473153 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:58:44.473153 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:58:44.473697 master-0 kubenswrapper[6932]: I0319 11:58:44.473162 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:58:44.890285 master-0 kubenswrapper[6932]: I0319 11:58:44.890107 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 11:58:45.461687 master-0 kubenswrapper[6932]: I0319 11:58:45.461555 6932 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_02e4c691-68ed-49f6-a8f6-c87579b65f07/installer/0.log" Mar 19 11:58:45.462184 master-0 kubenswrapper[6932]: I0319 11:58:45.461929 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 11:58:45.471501 master-0 kubenswrapper[6932]: I0319 11:58:45.470434 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e4c691-68ed-49f6-a8f6-c87579b65f07-kube-api-access\") pod \"02e4c691-68ed-49f6-a8f6-c87579b65f07\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " Mar 19 11:58:45.471501 master-0 kubenswrapper[6932]: I0319 11:58:45.470497 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-var-lock\") pod \"02e4c691-68ed-49f6-a8f6-c87579b65f07\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " Mar 19 11:58:45.471501 master-0 kubenswrapper[6932]: I0319 11:58:45.470635 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-kubelet-dir\") pod \"02e4c691-68ed-49f6-a8f6-c87579b65f07\" (UID: \"02e4c691-68ed-49f6-a8f6-c87579b65f07\") " Mar 19 11:58:45.471501 master-0 kubenswrapper[6932]: I0319 11:58:45.471077 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "02e4c691-68ed-49f6-a8f6-c87579b65f07" (UID: "02e4c691-68ed-49f6-a8f6-c87579b65f07"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:58:45.472024 master-0 kubenswrapper[6932]: I0319 11:58:45.471564 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-var-lock" (OuterVolumeSpecName: "var-lock") pod "02e4c691-68ed-49f6-a8f6-c87579b65f07" (UID: "02e4c691-68ed-49f6-a8f6-c87579b65f07"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:58:45.472413 master-0 kubenswrapper[6932]: I0319 11:58:45.472347 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:45.472413 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:45.472413 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:45.472413 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:45.472639 master-0 kubenswrapper[6932]: I0319 11:58:45.472440 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:45.475240 master-0 kubenswrapper[6932]: I0319 11:58:45.475188 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e4c691-68ed-49f6-a8f6-c87579b65f07-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "02e4c691-68ed-49f6-a8f6-c87579b65f07" (UID: "02e4c691-68ed-49f6-a8f6-c87579b65f07"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:58:45.568681 master-0 kubenswrapper[6932]: I0319 11:58:45.568617 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_02e4c691-68ed-49f6-a8f6-c87579b65f07/installer/0.log"
Mar 19 11:58:45.568681 master-0 kubenswrapper[6932]: I0319 11:58:45.568669 6932 generic.go:334] "Generic (PLEG): container finished" podID="02e4c691-68ed-49f6-a8f6-c87579b65f07" containerID="6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3" exitCode=1
Mar 19 11:58:45.569149 master-0 kubenswrapper[6932]: I0319 11:58:45.568895 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"02e4c691-68ed-49f6-a8f6-c87579b65f07","Type":"ContainerDied","Data":"6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3"}
Mar 19 11:58:45.569396 master-0 kubenswrapper[6932]: I0319 11:58:45.569352 6932 scope.go:117] "RemoveContainer" containerID="6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3"
Mar 19 11:58:45.569624 master-0 kubenswrapper[6932]: I0319 11:58:45.569589 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 11:58:45.569767 master-0 kubenswrapper[6932]: I0319 11:58:45.568972 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"02e4c691-68ed-49f6-a8f6-c87579b65f07","Type":"ContainerDied","Data":"823cd400f46cb5e679e7cac9ded1fe167be8f0e08c4ab31db4dd5e9aa924bcd5"}
Mar 19 11:58:45.571629 master-0 kubenswrapper[6932]: I0319 11:58:45.571587 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e4c691-68ed-49f6-a8f6-c87579b65f07-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 11:58:45.571629 master-0 kubenswrapper[6932]: I0319 11:58:45.571604 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 11:58:45.571629 master-0 kubenswrapper[6932]: I0319 11:58:45.571615 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e4c691-68ed-49f6-a8f6-c87579b65f07-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:58:45.590081 master-0 kubenswrapper[6932]: I0319 11:58:45.589989 6932 scope.go:117] "RemoveContainer" containerID="6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3"
Mar 19 11:58:45.590793 master-0 kubenswrapper[6932]: E0319 11:58:45.590708 6932 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3\": container with ID starting with 6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3 not found: ID does not exist" containerID="6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3"
Mar 19 11:58:45.590859 master-0 kubenswrapper[6932]: I0319 11:58:45.590783 6932 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3"} err="failed to get container status \"6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3\": rpc error: code = NotFound desc = could not find container \"6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3\": container with ID starting with 6384c9ad839ac528dae2f57abae0c1588c98fa5a01e5dd526fb6a9b608bc80e3 not found: ID does not exist"
Mar 19 11:58:45.609783 master-0 kubenswrapper[6932]: I0319 11:58:45.609661 6932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.609641985 podStartE2EDuration="1.609641985s" podCreationTimestamp="2026-03-19 11:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:45.541469861 +0000 UTC m=+349.900530093" watchObservedRunningTime="2026-03-19 11:58:45.609641985 +0000 UTC m=+349.968702207"
Mar 19 11:58:45.620700 master-0 kubenswrapper[6932]: I0319 11:58:45.620620 6932 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 19 11:58:45.626795 master-0 kubenswrapper[6932]: I0319 11:58:45.625542 6932 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 19 11:58:45.892159 master-0 kubenswrapper[6932]: I0319 11:58:45.886433 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e4c691-68ed-49f6-a8f6-c87579b65f07" path="/var/lib/kubelet/pods/02e4c691-68ed-49f6-a8f6-c87579b65f07/volumes"
Mar 19 11:58:46.472893 master-0 kubenswrapper[6932]: I0319 11:58:46.472805 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:46.472893 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:46.472893 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:46.472893 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:46.473251 master-0 kubenswrapper[6932]: I0319 11:58:46.472930 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:47.472122 master-0 kubenswrapper[6932]: I0319 11:58:47.472037 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:47.472122 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:47.472122 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:47.472122 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:47.472902 master-0 kubenswrapper[6932]: I0319 11:58:47.472133 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:48.471661 master-0 kubenswrapper[6932]: I0319 11:58:48.471548 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:48.471661 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:48.471661 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:48.471661 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:48.472062 master-0 kubenswrapper[6932]: I0319 11:58:48.471694 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:49.472747 master-0 kubenswrapper[6932]: I0319 11:58:49.472623 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:49.472747 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:49.472747 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:49.472747 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:49.473559 master-0 kubenswrapper[6932]: I0319 11:58:49.472772 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:50.155450 master-0 kubenswrapper[6932]: E0319 11:58:50.155354 6932 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd5667_f6f4_4c7c_92b2_ea4ecd0f0103.slice/crio-conmon-1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1.scope\": RecentStats: unable to find data in memory cache]"
Mar 19 11:58:50.472333 master-0 kubenswrapper[6932]: I0319 11:58:50.472253 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:50.472333 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:50.472333 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:50.472333 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:50.472759 master-0 kubenswrapper[6932]: I0319 11:58:50.472361 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:51.471848 master-0 kubenswrapper[6932]: I0319 11:58:51.471768 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:51.471848 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:51.471848 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:51.471848 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:51.471848 master-0 kubenswrapper[6932]: I0319 11:58:51.471837 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:52.472409 master-0 kubenswrapper[6932]: I0319 11:58:52.472345 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:52.472409 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:52.472409 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:52.472409 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:52.473204 master-0 kubenswrapper[6932]: I0319 11:58:52.472416 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:53.471637 master-0 kubenswrapper[6932]: I0319 11:58:53.471575 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:53.471637 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:53.471637 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:53.471637 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:53.471994 master-0 kubenswrapper[6932]: I0319 11:58:53.471653 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:54.472588 master-0 kubenswrapper[6932]: I0319 11:58:54.472496 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:54.472588 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:54.472588 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:54.472588 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:54.473231 master-0 kubenswrapper[6932]: I0319 11:58:54.472640 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:55.473766 master-0 kubenswrapper[6932]: I0319 11:58:55.473671 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:55.473766 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:55.473766 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:55.473766 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:55.475142 master-0 kubenswrapper[6932]: I0319 11:58:55.473832 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:56.471740 master-0 kubenswrapper[6932]: I0319 11:58:56.471646 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:56.471740 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:56.471740 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:56.471740 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:56.472050 master-0 kubenswrapper[6932]: I0319 11:58:56.471747 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:57.472920 master-0 kubenswrapper[6932]: I0319 11:58:57.472858 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:57.472920 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:57.472920 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:57.472920 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:57.473538 master-0 kubenswrapper[6932]: I0319 11:58:57.472928 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:58.472018 master-0 kubenswrapper[6932]: I0319 11:58:58.471951 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:58.472018 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:58.472018 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:58.472018 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:58.472018 master-0 kubenswrapper[6932]: I0319 11:58:58.472020 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:58:59.472223 master-0 kubenswrapper[6932]: I0319 11:58:59.472147 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:58:59.472223 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:58:59.472223 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:58:59.472223 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:58:59.472223 master-0 kubenswrapper[6932]: I0319 11:58:59.472228 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:00.240321 master-0 kubenswrapper[6932]: I0319 11:59:00.240235 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"
Mar 19 11:59:00.240321 master-0 kubenswrapper[6932]: I0319 11:59:00.240317 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"
Mar 19 11:59:00.472189 master-0 kubenswrapper[6932]: I0319 11:59:00.472106 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:00.472189 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:00.472189 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:00.472189 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:00.473003 master-0 kubenswrapper[6932]: I0319 11:59:00.472216 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:01.471912 master-0 kubenswrapper[6932]: I0319 11:59:01.471855 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:01.471912 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:01.471912 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:01.471912 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:01.472255 master-0 kubenswrapper[6932]: I0319 11:59:01.471942 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:02.471698 master-0 kubenswrapper[6932]: I0319 11:59:02.471637 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:02.471698 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:02.471698 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:02.471698 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:02.472332 master-0 kubenswrapper[6932]: I0319 11:59:02.471710 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:03.472451 master-0 kubenswrapper[6932]: I0319 11:59:03.472361 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:03.472451 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:03.472451 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:03.472451 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:03.473438 master-0 kubenswrapper[6932]: I0319 11:59:03.472479 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:04.472208 master-0 kubenswrapper[6932]: I0319 11:59:04.472123 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:04.472208 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:04.472208 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:04.472208 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:04.472208 master-0 kubenswrapper[6932]: I0319 11:59:04.472211 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:05.472452 master-0 kubenswrapper[6932]: I0319 11:59:05.472363 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:05.472452 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:05.472452 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:05.472452 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:05.473775 master-0 kubenswrapper[6932]: I0319 11:59:05.472459 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:06.472120 master-0 kubenswrapper[6932]: I0319 11:59:06.472001 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:06.472120 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:06.472120 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:06.472120 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:06.472568 master-0 kubenswrapper[6932]: I0319 11:59:06.472160 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:07.471810 master-0 kubenswrapper[6932]: I0319 11:59:07.471713 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:07.471810 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:07.471810 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:07.471810 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:07.472386 master-0 kubenswrapper[6932]: I0319 11:59:07.471831 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:08.473213 master-0 kubenswrapper[6932]: I0319 11:59:08.473130 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:08.473213 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:08.473213 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:08.473213 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:08.473872 master-0 kubenswrapper[6932]: I0319 11:59:08.473273 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:09.472263 master-0 kubenswrapper[6932]: I0319 11:59:09.472167 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:09.472263 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:09.472263 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:09.472263 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:09.472263 master-0 kubenswrapper[6932]: I0319 11:59:09.472262 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:10.472202 master-0 kubenswrapper[6932]: I0319 11:59:10.472123 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:10.472202 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:10.472202 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:10.472202 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:10.472202 master-0 kubenswrapper[6932]: I0319 11:59:10.472190 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:11.472890 master-0 kubenswrapper[6932]: I0319 11:59:11.472761 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:11.472890 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:11.472890 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:11.472890 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:11.472890 master-0 kubenswrapper[6932]: I0319 11:59:11.472878 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:12.472178 master-0 kubenswrapper[6932]: I0319 11:59:12.472101 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:12.472178 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:12.472178 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:12.472178 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:12.472178 master-0 kubenswrapper[6932]: I0319 11:59:12.472173 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:13.472506 master-0 kubenswrapper[6932]: I0319 11:59:13.472403 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:13.472506 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:13.472506 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:13.472506 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:13.472506 master-0 kubenswrapper[6932]: I0319 11:59:13.472482 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:14.473235 master-0 kubenswrapper[6932]: I0319 11:59:14.473155 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:14.473235 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:14.473235 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:14.473235 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:14.473235 master-0 kubenswrapper[6932]: I0319 11:59:14.473226 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:15.471386 master-0 kubenswrapper[6932]: I0319 11:59:15.471315 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:15.471386 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:15.471386 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:15.471386 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:15.471386 master-0 kubenswrapper[6932]: I0319 11:59:15.471386 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:16.471917 master-0 kubenswrapper[6932]: I0319 11:59:16.471840 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:16.471917 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:16.471917 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:16.471917 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:16.472627 master-0 kubenswrapper[6932]: I0319 11:59:16.471939 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:17.472160 master-0 kubenswrapper[6932]: I0319 11:59:17.472096 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:17.472160 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:17.472160 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:17.472160 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:17.472883 master-0 kubenswrapper[6932]: I0319 11:59:17.472170 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:18.471911 master-0 kubenswrapper[6932]: I0319 11:59:18.471833 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:18.471911 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:18.471911 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:18.471911 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:18.472206 master-0 kubenswrapper[6932]: I0319 11:59:18.471910 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:19.474196 master-0 kubenswrapper[6932]: I0319 11:59:19.474026 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:19.474196 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:19.474196 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:19.474196 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:19.475295 master-0 kubenswrapper[6932]: I0319 11:59:19.474276 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:20.248463 master-0 kubenswrapper[6932]: I0319 11:59:20.248410 6932 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"
Mar 19 11:59:20.253919 master-0 kubenswrapper[6932]: I0319 11:59:20.253828 6932 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"
Mar 19 11:59:20.471693 master-0 kubenswrapper[6932]: I0319 11:59:20.471619 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:20.471693 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:20.471693 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:20.471693 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:20.472110 master-0 kubenswrapper[6932]: I0319 11:59:20.471701 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:21.471846 master-0 kubenswrapper[6932]: I0319 11:59:21.471776 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:21.471846 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:21.471846 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:21.471846 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:21.472456 master-0 kubenswrapper[6932]: I0319 11:59:21.471856 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:22.472203 master-0 kubenswrapper[6932]: I0319 11:59:22.472063 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:22.472203 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld
Mar 19 11:59:22.472203 master-0 kubenswrapper[6932]: [+]process-running ok
Mar 19 11:59:22.472203 master-0 kubenswrapper[6932]: healthz check failed
Mar 19 11:59:22.472203 master-0 kubenswrapper[6932]: I0319 11:59:22.472178 6932
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:22.768860 master-0 kubenswrapper[6932]: I0319 11:59:22.768688 6932 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 11:59:22.769219 master-0 kubenswrapper[6932]: E0319 11:59:22.769152 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e4c691-68ed-49f6-a8f6-c87579b65f07" containerName="installer" Mar 19 11:59:22.769219 master-0 kubenswrapper[6932]: I0319 11:59:22.769182 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e4c691-68ed-49f6-a8f6-c87579b65f07" containerName="installer" Mar 19 11:59:22.769423 master-0 kubenswrapper[6932]: I0319 11:59:22.769389 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e4c691-68ed-49f6-a8f6-c87579b65f07" containerName="installer" Mar 19 11:59:22.770072 master-0 kubenswrapper[6932]: I0319 11:59:22.770029 6932 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 11:59:22.770426 master-0 kubenswrapper[6932]: I0319 11:59:22.770377 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" containerID="cri-o://21c17e15f1723f8eb75ec60f42ebd73c793697e640249886764928c881dbaaa1" gracePeriod=15 Mar 19 11:59:22.770689 master-0 kubenswrapper[6932]: I0319 11:59:22.770654 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.771278 master-0 kubenswrapper[6932]: I0319 11:59:22.771234 6932 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://39c756c5e9204811d8c83cfa45ff7447029413f92b87a61b82da1dc41e1a076d" gracePeriod=15 Mar 19 11:59:22.772548 master-0 kubenswrapper[6932]: I0319 11:59:22.772517 6932 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 11:59:22.772839 master-0 kubenswrapper[6932]: E0319 11:59:22.772804 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 11:59:22.772839 master-0 kubenswrapper[6932]: I0319 11:59:22.772834 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 11:59:22.773401 master-0 kubenswrapper[6932]: E0319 11:59:22.772854 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 11:59:22.773401 master-0 kubenswrapper[6932]: I0319 11:59:22.772868 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 11:59:22.773401 master-0 kubenswrapper[6932]: E0319 11:59:22.772905 6932 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 11:59:22.773401 master-0 kubenswrapper[6932]: I0319 11:59:22.772917 6932 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 11:59:22.773401 master-0 kubenswrapper[6932]: I0319 
11:59:22.773055 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 11:59:22.773401 master-0 kubenswrapper[6932]: I0319 11:59:22.773073 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 11:59:22.773401 master-0 kubenswrapper[6932]: I0319 11:59:22.773086 6932 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 11:59:22.778356 master-0 kubenswrapper[6932]: I0319 11:59:22.778315 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.789415 master-0 kubenswrapper[6932]: I0319 11:59:22.789364 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.789594 master-0 kubenswrapper[6932]: I0319 11:59:22.789420 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.789594 master-0 kubenswrapper[6932]: I0319 11:59:22.789463 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.789594 master-0 kubenswrapper[6932]: I0319 11:59:22.789484 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.789594 master-0 kubenswrapper[6932]: I0319 11:59:22.789514 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.789594 master-0 kubenswrapper[6932]: I0319 11:59:22.789538 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.789594 master-0 kubenswrapper[6932]: I0319 11:59:22.789566 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.789594 master-0 kubenswrapper[6932]: I0319 11:59:22.789598 6932 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.811751 master-0 kubenswrapper[6932]: I0319 11:59:22.811670 6932 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 11:59:22.842834 master-0 kubenswrapper[6932]: E0319 11:59:22.842539 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.892627 master-0 kubenswrapper[6932]: I0319 11:59:22.892529 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.892885 master-0 kubenswrapper[6932]: I0319 11:59:22.892654 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.892929 master-0 kubenswrapper[6932]: I0319 11:59:22.892862 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.893024 
master-0 kubenswrapper[6932]: I0319 11:59:22.892979 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.893143 master-0 kubenswrapper[6932]: I0319 11:59:22.893117 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.893180 master-0 kubenswrapper[6932]: I0319 11:59:22.893155 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.893265 master-0 kubenswrapper[6932]: I0319 11:59:22.893152 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.893305 master-0 kubenswrapper[6932]: I0319 11:59:22.893275 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
Mar 19 11:59:22.893305 master-0 kubenswrapper[6932]: I0319 11:59:22.893207 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.893369 master-0 kubenswrapper[6932]: I0319 11:59:22.893310 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.893625 master-0 kubenswrapper[6932]: I0319 11:59:22.893409 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.893625 master-0 kubenswrapper[6932]: I0319 11:59:22.893484 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:22.893625 master-0 kubenswrapper[6932]: I0319 11:59:22.893547 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.893764 master-0 kubenswrapper[6932]: I0319 11:59:22.893711 6932 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.893797 master-0 kubenswrapper[6932]: I0319 11:59:22.893779 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.893914 master-0 kubenswrapper[6932]: I0319 11:59:22.893393 6932 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:22.991716 master-0 kubenswrapper[6932]: I0319 11:59:22.991657 6932 patch_prober.go:28] interesting pod/bootstrap-kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.32.10:6443/readyz\": dial tcp 192.168.32.10:6443: connect: connection refused" start-of-body= Mar 19 11:59:22.992524 master-0 kubenswrapper[6932]: I0319 11:59:22.991807 6932 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" probeResult="failure" output="Get 
\"https://192.168.32.10:6443/readyz\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:22.993782 master-0 kubenswrapper[6932]: E0319 11:59:22.993014 6932 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Mar 19 11:59:22.993782 master-0 kubenswrapper[6932]: &Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3c434d86ef46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.32.10:6443/readyz": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:59:22.993782 master-0 kubenswrapper[6932]: body: Mar 19 11:59:22.993782 master-0 kubenswrapper[6932]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:59:22.991718214 +0000 UTC m=+387.350778436,LastTimestamp:2026-03-19 11:59:22.991718214 +0000 UTC m=+387.350778436,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 19 11:59:22.993782 master-0 kubenswrapper[6932]: > Mar 19 11:59:23.106612 master-0 kubenswrapper[6932]: I0319 11:59:23.106362 6932 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:23.125949 master-0 kubenswrapper[6932]: W0319 11:59:23.125888 6932 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7a82869988463543d3d8dd1f0b5fe3.slice/crio-c238dcb10339e469e019f35f43263a486da7ad20431c7557165dd244d72db205 WatchSource:0}: Error finding container c238dcb10339e469e019f35f43263a486da7ad20431c7557165dd244d72db205: Status 404 returned error can't find the container with id c238dcb10339e469e019f35f43263a486da7ad20431c7557165dd244d72db205 Mar 19 11:59:23.144120 master-0 kubenswrapper[6932]: I0319 11:59:23.144083 6932 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:23.472515 master-0 kubenswrapper[6932]: I0319 11:59:23.472457 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:23.472515 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:59:23.472515 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:59:23.472515 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:59:23.473195 master-0 kubenswrapper[6932]: I0319 11:59:23.472522 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:23.757028 master-0 kubenswrapper[6932]: E0319 11:59:23.756872 6932 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 
192.168.32.10:6443: connect: connection refused" event=< Mar 19 11:59:23.757028 master-0 kubenswrapper[6932]: &Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3c434d86ef46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.32.10:6443/readyz": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:59:23.757028 master-0 kubenswrapper[6932]: body: Mar 19 11:59:23.757028 master-0 kubenswrapper[6932]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:59:22.991718214 +0000 UTC m=+387.350778436,LastTimestamp:2026-03-19 11:59:22.991718214 +0000 UTC m=+387.350778436,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 19 11:59:23.757028 master-0 kubenswrapper[6932]: > Mar 19 11:59:23.815909 master-0 kubenswrapper[6932]: I0319 11:59:23.815851 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/1.log" Mar 19 11:59:23.822130 master-0 kubenswrapper[6932]: I0319 11:59:23.820551 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/0.log" Mar 19 11:59:23.822130 master-0 kubenswrapper[6932]: I0319 11:59:23.820626 6932 generic.go:334] "Generic (PLEG): container finished" podID="163d6a3d-0080-4122-bb7a-17f6e63f00f0" containerID="ceffe432bb3380aafe0729954185b3652b99ca21e97ac6c1e688d47217f36148" exitCode=1 Mar 19 11:59:23.822130 master-0 kubenswrapper[6932]: I0319 
11:59:23.820752 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" event={"ID":"163d6a3d-0080-4122-bb7a-17f6e63f00f0","Type":"ContainerDied","Data":"ceffe432bb3380aafe0729954185b3652b99ca21e97ac6c1e688d47217f36148"} Mar 19 11:59:23.822130 master-0 kubenswrapper[6932]: I0319 11:59:23.820793 6932 scope.go:117] "RemoveContainer" containerID="a5a674d7299c49bd88f1c56fca174966ef4c28920edc64023b6ce41812e041c8" Mar 19 11:59:23.822130 master-0 kubenswrapper[6932]: I0319 11:59:23.821426 6932 scope.go:117] "RemoveContainer" containerID="ceffe432bb3380aafe0729954185b3652b99ca21e97ac6c1e688d47217f36148" Mar 19 11:59:23.822130 master-0 kubenswrapper[6932]: E0319 11:59:23.821642 6932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-qrjj4_openshift-ingress-operator(163d6a3d-0080-4122-bb7a-17f6e63f00f0)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" podUID="163d6a3d-0080-4122-bb7a-17f6e63f00f0" Mar 19 11:59:23.823017 master-0 kubenswrapper[6932]: I0319 11:59:23.822778 6932 status_manager.go:851] "Failed to get status for pod" podUID="163d6a3d-0080-4122-bb7a-17f6e63f00f0" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-operator/pods/ingress-operator-66b84d69b-qrjj4\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:23.823608 master-0 kubenswrapper[6932]: I0319 11:59:23.823517 6932 status_manager.go:851] "Failed to get status for pod" podUID="8e7a82869988463543d3d8dd1f0b5fe3" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 
192.168.32.10:6443: connect: connection refused" Mar 19 11:59:23.825489 master-0 kubenswrapper[6932]: I0319 11:59:23.825451 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a"} Mar 19 11:59:23.825563 master-0 kubenswrapper[6932]: I0319 11:59:23.825499 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"c238dcb10339e469e019f35f43263a486da7ad20431c7557165dd244d72db205"} Mar 19 11:59:23.826397 master-0 kubenswrapper[6932]: I0319 11:59:23.826349 6932 status_manager.go:851] "Failed to get status for pod" podUID="163d6a3d-0080-4122-bb7a-17f6e63f00f0" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-operator/pods/ingress-operator-66b84d69b-qrjj4\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:23.827163 master-0 kubenswrapper[6932]: I0319 11:59:23.827130 6932 status_manager.go:851] "Failed to get status for pod" podUID="8e7a82869988463543d3d8dd1f0b5fe3" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:23.827817 master-0 kubenswrapper[6932]: I0319 11:59:23.827792 6932 generic.go:334] "Generic (PLEG): container finished" podID="1c576a88-6da4-43e9-a373-0df27a029f59" containerID="ddc94e7a85827e965bf13353b20a1293018b59883ea4cdbc55de2c9639ca8732" exitCode=0 Mar 19 11:59:23.827887 master-0 kubenswrapper[6932]: I0319 11:59:23.827865 6932 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"1c576a88-6da4-43e9-a373-0df27a029f59","Type":"ContainerDied","Data":"ddc94e7a85827e965bf13353b20a1293018b59883ea4cdbc55de2c9639ca8732"} Mar 19 11:59:23.828553 master-0 kubenswrapper[6932]: I0319 11:59:23.828517 6932 status_manager.go:851] "Failed to get status for pod" podUID="163d6a3d-0080-4122-bb7a-17f6e63f00f0" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-operator/pods/ingress-operator-66b84d69b-qrjj4\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:23.829854 master-0 kubenswrapper[6932]: I0319 11:59:23.829818 6932 status_manager.go:851] "Failed to get status for pod" podUID="8e7a82869988463543d3d8dd1f0b5fe3" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:23.830369 master-0 kubenswrapper[6932]: I0319 11:59:23.830335 6932 status_manager.go:851] "Failed to get status for pod" podUID="1c576a88-6da4-43e9-a373-0df27a029f59" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:23.835187 master-0 kubenswrapper[6932]: I0319 11:59:23.835157 6932 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="39c756c5e9204811d8c83cfa45ff7447029413f92b87a61b82da1dc41e1a076d" exitCode=0 Mar 19 11:59:23.837326 master-0 kubenswrapper[6932]: I0319 11:59:23.837302 6932 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" 
containerID="a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff" exitCode=0 Mar 19 11:59:23.837414 master-0 kubenswrapper[6932]: I0319 11:59:23.837334 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff"} Mar 19 11:59:23.837414 master-0 kubenswrapper[6932]: I0319 11:59:23.837362 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"e78919d3ec5c9e1fc04085900a692953e2087a6d624466d667eb24bc45d8ddb6"} Mar 19 11:59:23.838740 master-0 kubenswrapper[6932]: I0319 11:59:23.838700 6932 status_manager.go:851] "Failed to get status for pod" podUID="163d6a3d-0080-4122-bb7a-17f6e63f00f0" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-operator/pods/ingress-operator-66b84d69b-qrjj4\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:23.838819 master-0 kubenswrapper[6932]: E0319 11:59:23.838739 6932 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:23.839531 master-0 kubenswrapper[6932]: I0319 11:59:23.839437 6932 status_manager.go:851] "Failed to get status for pod" podUID="8e7a82869988463543d3d8dd1f0b5fe3" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 
11:59:23.840271 master-0 kubenswrapper[6932]: I0319 11:59:23.840234 6932 status_manager.go:851] "Failed to get status for pod" podUID="1c576a88-6da4-43e9-a373-0df27a029f59" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 11:59:24.473032 master-0 kubenswrapper[6932]: I0319 11:59:24.472908 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:24.473032 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:59:24.473032 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:59:24.473032 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:59:24.473032 master-0 kubenswrapper[6932]: I0319 11:59:24.472986 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:24.995510 master-0 kubenswrapper[6932]: I0319 11:59:24.992323 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29"} Mar 19 11:59:24.995510 master-0 kubenswrapper[6932]: I0319 11:59:24.992389 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4"} Mar 19 11:59:24.995510 master-0 
kubenswrapper[6932]: I0319 11:59:24.992404 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778"} Mar 19 11:59:24.995510 master-0 kubenswrapper[6932]: I0319 11:59:24.992418 6932 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd"} Mar 19 11:59:25.007758 master-0 kubenswrapper[6932]: I0319 11:59:25.004868 6932 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/1.log" Mar 19 11:59:25.187467 master-0 kubenswrapper[6932]: I0319 11:59:25.187327 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:59:25.356123 master-0 kubenswrapper[6932]: I0319 11:59:25.356081 6932 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.381803 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.381919 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.381946 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382069 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382069 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382127 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382160 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382181 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382094 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets" (OuterVolumeSpecName: "secrets") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382215 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs" (OuterVolumeSpecName: "logs") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382220 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config" (OuterVolumeSpecName: "config") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382312 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382504 6932 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382520 6932 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382532 6932 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382543 6932 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") on node \"master-0\" DevicePath \"\"" 
Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382555 6932 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 19 11:59:25.382592 master-0 kubenswrapper[6932]: I0319 11:59:25.382567 6932 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 19 11:59:25.472095 master-0 kubenswrapper[6932]: I0319 11:59:25.472032 6932 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-kpmgt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:25.472095 master-0 kubenswrapper[6932]: [-]has-synced failed: reason withheld Mar 19 11:59:25.472095 master-0 kubenswrapper[6932]: [+]process-running ok Mar 19 11:59:25.472095 master-0 kubenswrapper[6932]: healthz check failed Mar 19 11:59:25.472424 master-0 kubenswrapper[6932]: I0319 11:59:25.472110 6932 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" podUID="e2ad29ad-70ef-43c6-91f6-02f04d145673" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:25.483644 master-0 kubenswrapper[6932]: I0319 11:59:25.483585 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") pod \"1c576a88-6da4-43e9-a373-0df27a029f59\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " Mar 19 11:59:25.484347 master-0 kubenswrapper[6932]: I0319 11:59:25.484326 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") pod \"1c576a88-6da4-43e9-a373-0df27a029f59\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " Mar 19 11:59:25.484471 master-0 kubenswrapper[6932]: I0319 11:59:25.484459 6932 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"1c576a88-6da4-43e9-a373-0df27a029f59\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " Mar 19 11:59:25.484666 master-0 kubenswrapper[6932]: I0319 11:59:25.483752 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c576a88-6da4-43e9-a373-0df27a029f59" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:59:25.484744 master-0 kubenswrapper[6932]: I0319 11:59:25.484392 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock" (OuterVolumeSpecName: "var-lock") pod "1c576a88-6da4-43e9-a373-0df27a029f59" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:59:25.485061 master-0 kubenswrapper[6932]: I0319 11:59:25.485041 6932 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:59:25.485146 master-0 kubenswrapper[6932]: I0319 11:59:25.485134 6932 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:59:25.488910 master-0 kubenswrapper[6932]: I0319 11:59:25.488874 6932 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c576a88-6da4-43e9-a373-0df27a029f59" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:59:25.589010 master-0 kubenswrapper[6932]: I0319 11:59:25.587957 6932 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 11:59:25.947929 master-0 kubenswrapper[6932]: I0319 11:59:25.947863 6932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fac1b46a11e49501805e891baae4a9" path="/var/lib/kubelet/pods/49fac1b46a11e49501805e891baae4a9/volumes"
Mar 19 11:59:25.948328 master-0 kubenswrapper[6932]: I0319 11:59:25.948306 6932 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 19 11:59:26.037272 master-0 kubenswrapper[6932]: I0319 11:59:26.036159 6932 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="21c17e15f1723f8eb75ec60f42ebd73c793697e640249886764928c881dbaaa1" exitCode=0
Mar 19 11:59:26.037272 master-0 kubenswrapper[6932]: I0319 11:59:26.036299 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:59:26.041931 master-0 kubenswrapper[6932]: I0319 11:59:26.041215 6932 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:59:26.095373 master-0 systemd[1]: Stopping Kubernetes Kubelet...
Mar 19 11:59:26.152488 master-0 systemd[1]: kubelet.service: Deactivated successfully.
Mar 19 11:59:26.152849 master-0 systemd[1]: Stopped Kubernetes Kubelet.
Mar 19 11:59:26.154363 master-0 systemd[1]: kubelet.service: Consumed 47.195s CPU time.
Mar 19 11:59:26.169670 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 19 11:59:26.294638 master-0 kubenswrapper[17644]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:59:26.294638 master-0 kubenswrapper[17644]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 19 11:59:26.294638 master-0 kubenswrapper[17644]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:59:26.294638 master-0 kubenswrapper[17644]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:59:26.294638 master-0 kubenswrapper[17644]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 19 11:59:26.294638 master-0 kubenswrapper[17644]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:59:26.295361 master-0 kubenswrapper[17644]: I0319 11:59:26.294758 17644 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298042 17644 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298086 17644 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298092 17644 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298097 17644 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298103 17644 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298109 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298122 17644 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298127 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298132 17644 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298137 17644 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298142 17644 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298147 17644 
feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298154 17644 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298161 17644 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298167 17644 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298172 17644 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298176 17644 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298181 17644 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:59:26.300249 master-0 kubenswrapper[17644]: W0319 11:59:26.298186 17644 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298192 17644 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298197 17644 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298203 17644 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298228 17644 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298235 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298239 17644 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298244 17644 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298250 17644 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298255 17644 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298262 17644 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298269 17644 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298276 17644 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298282 17644 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298287 17644 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298291 17644 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298297 17644 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298301 17644 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298306 17644 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 11:59:26.300889 master-0 kubenswrapper[17644]: W0319 11:59:26.298310 17644 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298315 17644 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298319 17644 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298324 17644 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298328 17644 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298331 17644 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 11:59:26.301408 master-0 
kubenswrapper[17644]: W0319 11:59:26.298335 17644 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298338 17644 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298342 17644 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298346 17644 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298349 17644 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298353 17644 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298357 17644 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298361 17644 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298364 17644 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298368 17644 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298372 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298377 17644 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298382 17644 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298387 17644 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298390 17644 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:59:26.301408 master-0 kubenswrapper[17644]: W0319 11:59:26.298394 17644 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298398 17644 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298407 17644 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298410 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298414 17644 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298417 17644 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298421 17644 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298425 17644 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298428 17644 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298432 17644 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298435 17644 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298438 17644 feature_gate.go:330] 
unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298442 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: W0319 11:59:26.298446 17644 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: I0319 11:59:26.298544 17644 flags.go:64] FLAG: --address="0.0.0.0" Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: I0319 11:59:26.298555 17644 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: I0319 11:59:26.298591 17644 flags.go:64] FLAG: --anonymous-auth="true" Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: I0319 11:59:26.298598 17644 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: I0319 11:59:26.298604 17644 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: I0319 11:59:26.298608 17644 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: I0319 11:59:26.298614 17644 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 19 11:59:26.302977 master-0 kubenswrapper[17644]: I0319 11:59:26.298619 17644 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298623 17644 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298628 17644 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298632 17644 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298637 17644 
flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298642 17644 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298647 17644 flags.go:64] FLAG: --cgroup-root=""
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298653 17644 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298658 17644 flags.go:64] FLAG: --client-ca-file=""
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298663 17644 flags.go:64] FLAG: --cloud-config=""
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298668 17644 flags.go:64] FLAG: --cloud-provider=""
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298674 17644 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298681 17644 flags.go:64] FLAG: --cluster-domain=""
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298688 17644 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298692 17644 flags.go:64] FLAG: --config-dir=""
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298696 17644 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298701 17644 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298707 17644 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298711 17644 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298715 17644 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298720 17644 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298743 17644 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298749 17644 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298754 17644 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298760 17644 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 11:59:26.304134 master-0 kubenswrapper[17644]: I0319 11:59:26.298765 17644 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298774 17644 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298780 17644 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298785 17644 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298790 17644 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298795 17644 flags.go:64] FLAG: --enable-server="true"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298801 17644 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298808 17644 flags.go:64] FLAG: --event-burst="100"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298814 17644 flags.go:64] FLAG: --event-qps="50"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298819 17644 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298824 17644 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298829 17644 flags.go:64] FLAG: --eviction-hard=""
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298836 17644 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298842 17644 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298847 17644 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298852 17644 flags.go:64] FLAG: --eviction-soft=""
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298857 17644 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298863 17644 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298868 17644 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298873 17644 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298882 17644 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298887 17644 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298894 17644 flags.go:64] FLAG: --feature-gates=""
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298901 17644 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298906 17644 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 11:59:26.304812 master-0 kubenswrapper[17644]: I0319 11:59:26.298912 17644 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298917 17644 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298922 17644 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298928 17644 flags.go:64] FLAG: --help="false"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298933 17644 flags.go:64] FLAG: --hostname-override=""
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298939 17644 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298944 17644 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298949 17644 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298957 17644 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298963 17644 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298968 17644 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298973 17644 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298978 17644 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298983 17644 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298989 17644 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.298994 17644 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299000 17644 flags.go:64] FLAG: --kube-reserved=""
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299004 17644 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299009 17644 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299014 17644 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299019 17644 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299025 17644 flags.go:64] FLAG: --lock-file=""
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299030 17644 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299036 17644 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299041 17644 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299050 17644 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 11:59:26.305798 master-0 kubenswrapper[17644]: I0319 11:59:26.299055 17644 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299064 17644 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299069 17644 flags.go:64] FLAG: --logging-format="text"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299074 17644 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299080 17644 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299086 17644 flags.go:64] FLAG: --manifest-url=""
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299091 17644 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299099 17644 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299104 17644 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299111 17644 flags.go:64] FLAG: --max-pods="110"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299116 17644 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299121 17644 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299126 17644 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299132 17644 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299140 17644 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299146 17644 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299151 17644 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299164 17644 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299169 17644 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299174 17644 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299180 17644 flags.go:64] FLAG: --pod-cidr=""
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299185 17644 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299195 17644 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 11:59:26.306573 master-0 kubenswrapper[17644]: I0319 11:59:26.299200 17644 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299206 17644 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299211 17644 flags.go:64] FLAG: --port="10250"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299216 17644 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299222 17644 flags.go:64] FLAG: --provider-id=""
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299227 17644 flags.go:64] FLAG: --qos-reserved=""
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299232 17644 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299238 17644 flags.go:64] FLAG: --register-node="true"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299243 17644 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299249 17644 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299259 17644 flags.go:64] FLAG: --registry-burst="10"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299268 17644 flags.go:64] FLAG: --registry-qps="5"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299273 17644 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299278 17644 flags.go:64] FLAG: --reserved-memory=""
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299285 17644 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299291 17644 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299296 17644 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299302 17644 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299307 17644 flags.go:64] FLAG: --runonce="false"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299312 17644 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299318 17644 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299324 17644 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299329 17644 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299337 17644 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299343 17644 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299348 17644 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 11:59:26.307227 master-0 kubenswrapper[17644]: I0319 11:59:26.299354 17644 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299360 17644 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299365 17644 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299370 17644 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299375 17644 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299380 17644 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299386 17644 flags.go:64] FLAG: --system-cgroups=""
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299391 17644 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299400 17644 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299405 17644 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299411 17644 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299418 17644 flags.go:64] FLAG: --tls-min-version=""
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299423 17644 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299429 17644 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299434 17644 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299439 17644 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299444 17644 flags.go:64] FLAG: --v="2"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299458 17644 flags.go:64] FLAG: --version="false"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299465 17644 flags.go:64] FLAG: --vmodule=""
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299472 17644 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: I0319 11:59:26.299478 17644 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: W0319 11:59:26.299599 17644 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: W0319 11:59:26.299608 17644 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: W0319 11:59:26.299613 17644 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:59:26.321415 master-0 kubenswrapper[17644]: W0319 11:59:26.299618 17644 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299623 17644 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299628 17644 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299633 17644 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299639 17644 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299648 17644 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299655 17644 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299661 17644 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299667 17644 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299673 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299679 17644 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299683 17644 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299688 17644 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299694 17644 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299698 17644 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299704 17644 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299708 17644 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299713 17644 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:59:26.322319 master-0 kubenswrapper[17644]: W0319 11:59:26.299718 17644 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299722 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299747 17644 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299753 17644 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299758 17644 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299763 17644 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299768 17644 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299776 17644 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299781 17644 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299786 17644 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299791 17644 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299795 17644 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299800 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299805 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299810 17644 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299814 17644 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299819 17644 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299824 17644 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299830 17644 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299839 17644 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:59:26.322983 master-0 kubenswrapper[17644]: W0319 11:59:26.299844 17644 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299848 17644 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299853 17644 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299858 17644 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299864 17644 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299869 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299873 17644 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299878 17644 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299882 17644 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299887 17644 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299891 17644 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299896 17644 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299902 17644 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299906 17644 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299911 17644 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299917 17644 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299921 17644 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299925 17644 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299930 17644 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299937 17644 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:59:26.332342 master-0 kubenswrapper[17644]: W0319 11:59:26.299941 17644 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299945 17644 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299950 17644 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299954 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299959 17644 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299965 17644 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299971 17644 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299976 17644 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299981 17644 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299986 17644 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.299990 17644 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: I0319 11:59:26.300002 17644 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: I0319 11:59:26.304189 17644 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: I0319 11:59:26.304209 17644 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.305015 17644 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:59:26.333070 master-0 kubenswrapper[17644]: W0319 11:59:26.305072 17644 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305077 17644 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305083 17644 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305088 17644 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305093 17644 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305105 17644 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305110 17644 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305115 17644 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305119 17644 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305124 17644 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305129 17644 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305133 17644 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305138 17644 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305142 17644 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305147 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305151 17644 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305158 17644 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305167 17644 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305173 17644 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305181 17644 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:59:26.333580 master-0 kubenswrapper[17644]: W0319 11:59:26.305190 17644 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305196 17644 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305200 17644 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305205 17644 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305211 17644 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305216 17644 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305221 17644 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305225 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305232 17644 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305236 17644 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305247 17644 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305252 17644 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305259 17644 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305264 17644 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305268 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305276 17644 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305281 17644 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305286 17644 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305291 17644 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305296 17644 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:59:26.334246 master-0 kubenswrapper[17644]: W0319 11:59:26.305300 17644 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305305 17644 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305310 17644 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305318 17644 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305323 17644 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305328 17644 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305332 17644 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305338 17644 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305345 17644 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305352 17644 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305360 17644 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305366 17644 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305372 17644 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305377 17644 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305386 17644 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305391 17644 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305396 17644 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305402 17644 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305406 17644 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:59:26.350453 master-0 kubenswrapper[17644]: W0319 11:59:26.305411 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305417 17644 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305422 17644 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305427 17644 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305432 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305437 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305481 17644 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305488 17644 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305500 17644 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305506 17644 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305511 17644 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305516 17644 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: I0319 11:59:26.305528 17644 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.305989 17644 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.306036 17644 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:59:26.351184 master-0 kubenswrapper[17644]: W0319 11:59:26.306041 17644 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306046 17644 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306050 17644 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306054 17644 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306060 17644 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306064 17644 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306074 17644 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306084 17644 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306089 17644 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306107 17644 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306111 17644 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306116 17644 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306120 17644 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306125 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306130 17644 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306134 17644 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306138 17644 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306142 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306149 17644 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:59:26.351721 master-0 kubenswrapper[17644]: W0319 11:59:26.306153 17644 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306157 17644 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306162 17644 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306168 17644 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306174 17644 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306179 17644 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306184 17644 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306190 17644 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306195 17644 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306200 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306204 17644 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306211 17644 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306217 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306221 17644 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306225 17644 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306231 17644 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306236 17644 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306240 17644 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306244 17644 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:59:26.352415 master-0 kubenswrapper[17644]: W0319 11:59:26.306248 17644 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306251 17644 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306256 17644 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306259 17644 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306263 17644 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306282 17644 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306286 17644 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306290 17644 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306294 17644 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306297 17644 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306302 17644 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306305 17644 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306310 17644 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306315 17644 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306320 17644 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306324 17644 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306328 17644 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306334 17644 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306338 17644 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306342 17644 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:59:26.353197 master-0 kubenswrapper[17644]: W0319 11:59:26.306348 17644 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306353 17644 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306358 17644 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306363 17644 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306367 17644 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306372 17644 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306377 17644 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306382 17644 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306386 17644 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306391 17644 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306399 17644 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: W0319 11:59:26.306404 17644 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: I0319 11:59:26.306419 17644 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: I0319 11:59:26.306785 17644 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: I0319 11:59:26.338017 17644 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 19 11:59:26.353956 master-0 kubenswrapper[17644]: I0319 11:59:26.338155 17644 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 19 11:59:26.354597 master-0 kubenswrapper[17644]: I0319 11:59:26.338446 17644 server.go:997] "Starting client certificate rotation"
Mar 19 11:59:26.354597 master-0 kubenswrapper[17644]: I0319 11:59:26.338458 17644 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 11:59:26.354597 master-0 kubenswrapper[17644]: I0319 11:59:26.338640 17644 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 11:43:03 +0000 UTC, rotation deadline is 2026-03-20 06:30:39.892746622 +0000 UTC
Mar 19 11:59:26.354597 master-0 kubenswrapper[17644]: I0319 11:59:26.338709 17644 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h31m13.554040177s for next certificate rotation
Mar 19 11:59:26.354597 master-0 kubenswrapper[17644]: I0319 11:59:26.339194 17644 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 11:59:26.354597 master-0 kubenswrapper[17644]: I0319 11:59:26.340720 17644 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 11:59:26.354597 master-0 kubenswrapper[17644]: I0319 11:59:26.352767 17644 log.go:25] "Validated CRI v1 runtime API"
Mar 19 11:59:26.359331 master-0 kubenswrapper[17644]: I0319 11:59:26.358806 17644 log.go:25] "Validated CRI v1 image API"
Mar 19 11:59:26.362274 master-0 kubenswrapper[17644]: I0319 11:59:26.362220 17644 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 11:59:26.373876 master-0 kubenswrapper[17644]: I0319 11:59:26.373806 17644 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 84bbc972-b2a6-48d9-8e4d-c9ff50fad0b0:/dev/vda3 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 19 11:59:26.374984 master-0 kubenswrapper[17644]: I0319 11:59:26.373857 17644 fs.go:136] Filesystem partitions:
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/01bd4a2802323b3faf679fc3ea0fe20efacc45eab046badf1be6c2b07116febc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/01bd4a2802323b3faf679fc3ea0fe20efacc45eab046badf1be6c2b07116febc/userdata/shm major:0 minor:816 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/12f1d4709c9e0d0ad1a233908194f29f84992ca4b99bba4692dddc9f4338c1ef/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/12f1d4709c9e0d0ad1a233908194f29f84992ca4b99bba4692dddc9f4338c1ef/userdata/shm major:0 minor:443 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/14576382107dc09a133f25dfe11c859b57d691f83816910915dfdbd5db8c6773/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/14576382107dc09a133f25dfe11c859b57d691f83816910915dfdbd5db8c6773/userdata/shm major:0 minor:622 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef/userdata/shm major:0 minor:276 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/17a53907f75f6dae7caa627268daa345a6154ff885830dae9a1873ed761e0552/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/17a53907f75f6dae7caa627268daa345a6154ff885830dae9a1873ed761e0552/userdata/shm major:0 minor:1095 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/18f4fa41b32bbdc0315d2c159f68c3407a8234dacc09fe18dea04525d0e88d8c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/18f4fa41b32bbdc0315d2c159f68c3407a8234dacc09fe18dea04525d0e88d8c/userdata/shm major:0 minor:808 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a/userdata/shm major:0 minor:144 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/25dfae9bb0843173d90c844dacf16818cb3d6d61cb972bb6cd1177b47a320778/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/25dfae9bb0843173d90c844dacf16818cb3d6d61cb972bb6cd1177b47a320778/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/26bcc676a684bba59ce239a7b0c6d837715bffea1d6d9d661570c6d71c3af31c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/26bcc676a684bba59ce239a7b0c6d837715bffea1d6d9d661570c6d71c3af31c/userdata/shm major:0 minor:354 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/27419838ac6bb228f6151c74e466e550ee30c7ce1c14772f63c150dcd524d6e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/27419838ac6bb228f6151c74e466e550ee30c7ce1c14772f63c150dcd524d6e7/userdata/shm major:0 minor:279 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e/userdata/shm major:0 minor:282 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2c0b2d29cecf537e4921aa4396580e1259d6519819244de28dc54db9b3eeb9d0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2c0b2d29cecf537e4921aa4396580e1259d6519819244de28dc54db9b3eeb9d0/userdata/shm major:0 minor:656 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/36816d6fc70a9260d540c9487629bb4d582fa5330a4c11074ee3f05c1e9cbe38/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/36816d6fc70a9260d540c9487629bb4d582fa5330a4c11074ee3f05c1e9cbe38/userdata/shm major:0 minor:969 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/38cb26629a14fdae9d7f35eac30d1706193c11f4823405b1ab890376e3178bdd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/38cb26629a14fdae9d7f35eac30d1706193c11f4823405b1ab890376e3178bdd/userdata/shm major:0 minor:684 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/402476363b5df4845bdf76440169d41c48c7c304f89463a3160ab10c4b9c45da/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/402476363b5df4845bdf76440169d41c48c7c304f89463a3160ab10c4b9c45da/userdata/shm major:0 minor:331 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/45a1f521d794b1ca367dab762c38f2fc2e98e9ba7d75ffaddb7fceef49fff20d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/45a1f521d794b1ca367dab762c38f2fc2e98e9ba7d75ffaddb7fceef49fff20d/userdata/shm major:0 minor:464 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/492d91fc21d30f80345040a63ee30545a1658028ca8d297dee64246b255c0fcb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/492d91fc21d30f80345040a63ee30545a1658028ca8d297dee64246b255c0fcb/userdata/shm major:0 minor:813 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4b35ea3ef7523bac2219f1d11ac9a4ce57129adbac9b8a1915c2a12e2d7a7c68/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4b35ea3ef7523bac2219f1d11ac9a4ce57129adbac9b8a1915c2a12e2d7a7c68/userdata/shm major:0 minor:308 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4f25a976585d22d9ce3955473a200e96837f45c766e321488b3d87050f023b7a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4f25a976585d22d9ce3955473a200e96837f45c766e321488b3d87050f023b7a/userdata/shm major:0 minor:439 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5359d955256489cf75babf6cd7e374f24ca5753414f295ec115bac354fbe37e1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5359d955256489cf75babf6cd7e374f24ca5753414f295ec115bac354fbe37e1/userdata/shm major:0 minor:249 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5a3a840584953aa05811a73f7731c28fab3047c34d3f28cfbf2a20aad97cf6c3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5a3a840584953aa05811a73f7731c28fab3047c34d3f28cfbf2a20aad97cf6c3/userdata/shm major:0 minor:1015 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/604f16a2ad7d04e1bbd75b7eca1988232760bcd65e1311be08e2c7a3cbb4c10a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/604f16a2ad7d04e1bbd75b7eca1988232760bcd65e1311be08e2c7a3cbb4c10a/userdata/shm major:0 minor:817 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6502c99aaf4d4f945a08ddd70ddf47028a9961291a598bc4054d9498e0e3049e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6502c99aaf4d4f945a08ddd70ddf47028a9961291a598bc4054d9498e0e3049e/userdata/shm major:0 minor:444 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/67cce31157aba8cd32c19fdb97a814cf6764c07048a060294135b6ce20e85f0e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/67cce31157aba8cd32c19fdb97a814cf6764c07048a060294135b6ce20e85f0e/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/683aa0635e184216531580a563438a1b652c9e9d46d69283fd6cdf0548cf223d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/683aa0635e184216531580a563438a1b652c9e9d46d69283fd6cdf0548cf223d/userdata/shm major:0 minor:814 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6956b1980f4b04ce367cfbad3aeea7396b54e1517e031f7afdbbd760960fd241/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6956b1980f4b04ce367cfbad3aeea7396b54e1517e031f7afdbbd760960fd241/userdata/shm major:0 minor:1113 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/738ee83a26962b58779f847316e3a8a5be1d6bd92f0c0f29c25cdbb8703c5c59/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/738ee83a26962b58779f847316e3a8a5be1d6bd92f0c0f29c25cdbb8703c5c59/userdata/shm major:0 minor:771 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61/userdata/shm major:0 minor:240 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/74d7d2df3602ec247c94c7641e1ca1523b5ae6b42624ca797fbd2b6225dfbfa4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/74d7d2df3602ec247c94c7641e1ca1523b5ae6b42624ca797fbd2b6225dfbfa4/userdata/shm major:0 minor:260 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8070d874c8e6aab4717f63db58f142956c8f18d4e16e21f12ce84898692af2f8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8070d874c8e6aab4717f63db58f142956c8f18d4e16e21f12ce84898692af2f8/userdata/shm major:0 minor:812 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/82ce23dbad1fafac03170cf8dbdc37b1358bba5d494b0305bc59731ec33ac062/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/82ce23dbad1fafac03170cf8dbdc37b1358bba5d494b0305bc59731ec33ac062/userdata/shm major:0 minor:614 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/832f700980eab592f836b89a6aebe98be99148aa95ac29165addb6fccc6389c3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/832f700980eab592f836b89a6aebe98be99148aa95ac29165addb6fccc6389c3/userdata/shm major:0 minor:650 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/857137fd3aca8af8c5c19bcaeff329a322e9a54b7ff7f19d360c176d0e68cab5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/857137fd3aca8af8c5c19bcaeff329a322e9a54b7ff7f19d360c176d0e68cab5/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/862dbe8b648f15ee3ab2e74272152e657f518e1985ef0d38baf17c28a33a4abb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/862dbe8b648f15ee3ab2e74272152e657f518e1985ef0d38baf17c28a33a4abb/userdata/shm major:0 minor:607 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8/userdata/shm major:0 minor:280 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8ce4d1f32bb0cc2bd6719ecbc1bb660798af73ec1a021eb215e32bb686d9ba1b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8ce4d1f32bb0cc2bd6719ecbc1bb660798af73ec1a021eb215e32bb686d9ba1b/userdata/shm major:0 minor:1093 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8e1fd7c8f094ce1e4302e058e811af6aae2e3addf7bd81aa94568f27af29f0c9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8e1fd7c8f094ce1e4302e058e811af6aae2e3addf7bd81aa94568f27af29f0c9/userdata/shm major:0 minor:478 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8f6241ff25db322ca912b366aec02ce24e776e994e5454c053b2a00c5bd1a93b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8f6241ff25db322ca912b366aec02ce24e776e994e5454c053b2a00c5bd1a93b/userdata/shm major:0 minor:438 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/937722eb1a7c864cdcd30f00f097350a31b04584988bb543632ba097925b3bbe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/937722eb1a7c864cdcd30f00f097350a31b04584988bb543632ba097925b3bbe/userdata/shm major:0 minor:62 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/94301b494e7fa86c5ac2e6fa986da464195a196e9774c438e5a44b6eb0b525ae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/94301b494e7fa86c5ac2e6fa986da464195a196e9774c438e5a44b6eb0b525ae/userdata/shm major:0 minor:820 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/975e632bf87b61a6785fc741d9417b8abbd6243ba2abd8088f9fe581fcfef90c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/975e632bf87b61a6785fc741d9417b8abbd6243ba2abd8088f9fe581fcfef90c/userdata/shm major:0 minor:624 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9a2616ea0257b4942755b9e9fb23bb4dfd3518f40e9ffe96a9ef4230caaa00fe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9a2616ea0257b4942755b9e9fb23bb4dfd3518f40e9ffe96a9ef4230caaa00fe/userdata/shm major:0 minor:557 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a00871e023b42142484ad987a4c2956151fb53dc58e2ab128b59501bf258f39e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a00871e023b42142484ad987a4c2956151fb53dc58e2ab128b59501bf258f39e/userdata/shm major:0 minor:1065 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a1c364bd3d663a56cc2f90bf6e8ea8c50127add36b90978697972f8218a89ed7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a1c364bd3d663a56cc2f90bf6e8ea8c50127add36b90978697972f8218a89ed7/userdata/shm major:0 minor:395 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/a2b791c04ceadd3a171b7dda7655ef7534b61d799b6ce663909c8e48a8e61525/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a2b791c04ceadd3a171b7dda7655ef7534b61d799b6ce663909c8e48a8e61525/userdata/shm major:0 minor:1017 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b/userdata/shm major:0 minor:99 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aa3b1b6a2b92eddacf5dacab6a0147cb12a4e498e9d143158aa50de12bd5c3b7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa3b1b6a2b92eddacf5dacab6a0147cb12a4e498e9d143158aa50de12bd5c3b7/userdata/shm major:0 minor:509 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/adc1d294cb2c4faeba2726e706421944c88f312613f6f9484ed976e0c65190f9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/adc1d294cb2c4faeba2726e706421944c88f312613f6f9484ed976e0c65190f9/userdata/shm major:0 minor:994 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b358fce6bb46e5b5037cb28d6e8fc423fe1541e849c427617b2d5f7f7a209743/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b358fce6bb46e5b5037cb28d6e8fc423fe1541e849c427617b2d5f7f7a209743/userdata/shm major:0 minor:818 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b54d0875a5c74a95cdb12684066d437927027dd749aa30fdd27e9a88de808b47/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b54d0875a5c74a95cdb12684066d437927027dd749aa30fdd27e9a88de808b47/userdata/shm major:0 minor:236 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/bca8253525b3cd943116e55714fdf37c6331867834b278964c5e6f5dd4c53fef/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bca8253525b3cd943116e55714fdf37c6331867834b278964c5e6f5dd4c53fef/userdata/shm major:0 minor:441 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c0337cf9dcdc7cc749cac3adad0f44d0d5457a466ca84750f37317d1eb4a70f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c0337cf9dcdc7cc749cac3adad0f44d0d5457a466ca84750f37317d1eb4a70f1/userdata/shm major:0 minor:699 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c238dcb10339e469e019f35f43263a486da7ad20431c7557165dd244d72db205/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c238dcb10339e469e019f35f43263a486da7ad20431c7557165dd244d72db205/userdata/shm major:0 minor:89 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/caa4e9bd96e874f51a79da89bbb64da72933b4ef3464772d351cf399d375866a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/caa4e9bd96e874f51a79da89bbb64da72933b4ef3464772d351cf399d375866a/userdata/shm major:0 minor:477 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d016f849de27f64d027bbd73120eb329b0253680086fad1c1a5d1d59daba5c27/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d016f849de27f64d027bbd73120eb329b0253680086fad1c1a5d1d59daba5c27/userdata/shm major:0 minor:111 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d26de8d7725dab288840f8eb4631a12a8821676d8fd47b0810577c9ee4f7e3b9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d26de8d7725dab288840f8eb4631a12a8821676d8fd47b0810577c9ee4f7e3b9/userdata/shm major:0 minor:1152 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d2ecc9d2456937963c5f6bc8147a2bbe973205b8a12f3d89082b348a330ba2e2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d2ecc9d2456937963c5f6bc8147a2bbe973205b8a12f3d89082b348a330ba2e2/userdata/shm major:0 minor:1047 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d6045bc934b39d2e74e105cd2ee97a2d4e1d69429d08a4cbb80aeb107f492bc3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d6045bc934b39d2e74e105cd2ee97a2d4e1d69429d08a4cbb80aeb107f492bc3/userdata/shm major:0 minor:580 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d9bf0e017da39714ca0d58a2ba0c46cd89a43ae7f317d13dbb6e31831feeb576/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9bf0e017da39714ca0d58a2ba0c46cd89a43ae7f317d13dbb6e31831feeb576/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e01c0d4f6330ee155cedce051137a3842f3cbc1b8b4039503e3a3e9fd950bf49/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e01c0d4f6330ee155cedce051137a3842f3cbc1b8b4039503e3a3e9fd950bf49/userdata/shm major:0 minor:1019 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e4a5278beb2dd7685cc80a2eb75df7f2fe99740c2893e28197254b1cb14f8f97/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e4a5278beb2dd7685cc80a2eb75df7f2fe99740c2893e28197254b1cb14f8f97/userdata/shm major:0 minor:809 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e6908445d4f9d29994371a77f0165de1617d0b3d69f7e33acfc73003f26e2111/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e6908445d4f9d29994371a77f0165de1617d0b3d69f7e33acfc73003f26e2111/userdata/shm major:0 minor:657 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e78919d3ec5c9e1fc04085900a692953e2087a6d624466d667eb24bc45d8ddb6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e78919d3ec5c9e1fc04085900a692953e2087a6d624466d667eb24bc45d8ddb6/userdata/shm major:0 minor:97 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e8aeb063908bf0937ac94f698cd72366b310f38d0a1756120e33b67a92cd55de/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e8aeb063908bf0937ac94f698cd72366b310f38d0a1756120e33b67a92cd55de/userdata/shm major:0 minor:533 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e8bcebf454a79198b14303cf41946d1cf832021a30a2591e1b23c6740fca1e9b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e8bcebf454a79198b14303cf41946d1cf832021a30a2591e1b23c6740fca1e9b/userdata/shm major:0 minor:608 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/eab7f63dc5326173ea1e6327285462aa6a81c9b141ac54e3d2487017aec7ef32/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/eab7f63dc5326173ea1e6327285462aa6a81c9b141ac54e3d2487017aec7ef32/userdata/shm major:0 minor:698 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ebe898baef0ea3cf7f17e803722db21b9281248ac2ac1d6fe40d8e59580a9cee/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ebe898baef0ea3cf7f17e803722db21b9281248ac2ac1d6fe40d8e59580a9cee/userdata/shm major:0 minor:646 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ee84c91e209b8d15a57102d97b9b923b7a0a0247657f697f48616f38ce178b0b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ee84c91e209b8d15a57102d97b9b923b7a0a0247657f697f48616f38ce178b0b/userdata/shm major:0 minor:221 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f7a356015607c77d353df6671f85d12adf9e42d7853bd37134503d15b666f482/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f7a356015607c77d353df6671f85d12adf9e42d7853bd37134503d15b666f482/userdata/shm major:0 minor:583 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fadafce6827750abea7ff3c06bd1da6d8ccd788c149ff361b207b48ae0bcefb8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fadafce6827750abea7ff3c06bd1da6d8ccd788c149ff361b207b48ae0bcefb8/userdata/shm major:0 minor:790 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/00dd3703-af25-4e71-b20b-b3e153383489/volumes/kubernetes.io~projected/kube-api-access-k9ddk:{mountpoint:/var/lib/kubelet/pods/00dd3703-af25-4e71-b20b-b3e153383489/volumes/kubernetes.io~projected/kube-api-access-k9ddk major:0 minor:364 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/034cad93-a500-4c58-8d97-fa49866a0d5e/volumes/kubernetes.io~projected/kube-api-access-jptl6:{mountpoint:/var/lib/kubelet/pods/034cad93-a500-4c58-8d97-fa49866a0d5e/volumes/kubernetes.io~projected/kube-api-access-jptl6 major:0 minor:801 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/034cad93-a500-4c58-8d97-fa49866a0d5e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/034cad93-a500-4c58-8d97-fa49866a0d5e/volumes/kubernetes.io~secret/serving-cert major:0 minor:800 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/09a22c25-6073-4b1a-a029-928452ef37db/volumes/kubernetes.io~projected/kube-api-access-xx4wk:{mountpoint:/var/lib/kubelet/pods/09a22c25-6073-4b1a-a029-928452ef37db/volumes/kubernetes.io~projected/kube-api-access-xx4wk major:0 minor:105 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab/volumes/kubernetes.io~projected/kube-api-access-8v9bx:{mountpoint:/var/lib/kubelet/pods/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab/volumes/kubernetes.io~projected/kube-api-access-8v9bx major:0 minor:804 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:798 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/12809811-c9df-4e77-8c12-309831b8975d/volumes/kubernetes.io~projected/kube-api-access-bdx6s:{mountpoint:/var/lib/kubelet/pods/12809811-c9df-4e77-8c12-309831b8975d/volumes/kubernetes.io~projected/kube-api-access-bdx6s major:0 minor:993 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/12809811-c9df-4e77-8c12-309831b8975d/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/12809811-c9df-4e77-8c12-309831b8975d/volumes/kubernetes.io~secret/proxy-tls major:0 minor:989 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/kube-api-access-m7tc5:{mountpoint:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/kube-api-access-m7tc5 major:0 minor:269 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~secret/metrics-tls major:0 minor:434 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b94d1eb-1b80-4a14-b1c0-d9e192231352/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/1b94d1eb-1b80-4a14-b1c0-d9e192231352/volumes/kubernetes.io~projected/ca-certs major:0 minor:462 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b94d1eb-1b80-4a14-b1c0-d9e192231352/volumes/kubernetes.io~projected/kube-api-access-srlcl:{mountpoint:/var/lib/kubelet/pods/1b94d1eb-1b80-4a14-b1c0-d9e192231352/volumes/kubernetes.io~projected/kube-api-access-srlcl major:0 minor:463 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1c898657-f06b-44ab-95ff-53a324759ba1/volumes/kubernetes.io~projected/kube-api-access-mt6bf:{mountpoint:/var/lib/kubelet/pods/1c898657-f06b-44ab-95ff-53a324759ba1/volumes/kubernetes.io~projected/kube-api-access-mt6bf major:0 minor:578 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c/volumes/kubernetes.io~projected/kube-api-access-2mxjl:{mountpoint:/var/lib/kubelet/pods/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c/volumes/kubernetes.io~projected/kube-api-access-2mxjl major:0 minor:577 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c/volumes/kubernetes.io~secret/metrics-tls major:0 minor:567 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2292109e-92a9-4286-858e-dcd2ac083c43/volumes/kubernetes.io~projected/kube-api-access-8rt57:{mountpoint:/var/lib/kubelet/pods/2292109e-92a9-4286-858e-dcd2ac083c43/volumes/kubernetes.io~projected/kube-api-access-8rt57 major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~projected/kube-api-access-wls49:{mountpoint:/var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~projected/kube-api-access-wls49 major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~secret/metrics-tls major:0 minor:436 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24f71770-714e-4111-9188-ad8663c6baa7/volumes/kubernetes.io~projected/kube-api-access-m287x:{mountpoint:/var/lib/kubelet/pods/24f71770-714e-4111-9188-ad8663c6baa7/volumes/kubernetes.io~projected/kube-api-access-m287x major:0 minor:842 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24f71770-714e-4111-9188-ad8663c6baa7/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/24f71770-714e-4111-9188-ad8663c6baa7/volumes/kubernetes.io~secret/proxy-tls major:0 minor:837 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~projected/kube-api-access-kwrd5:{mountpoint:/var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~projected/kube-api-access-kwrd5 major:0 minor:1091 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1089 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1087 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~projected/kube-api-access-2bb2x:{mountpoint:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~projected/kube-api-access-2bb2x major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~projected/ca-certs major:0 minor:466 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~projected/kube-api-access-f2hrw:{mountpoint:/var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~projected/kube-api-access-f2hrw major:0 minor:467 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:473 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~projected/kube-api-access major:0 minor:259 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~secret/serving-cert major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~projected/kube-api-access-pngsr:{mountpoint:/var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~projected/kube-api-access-pngsr major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e2c195f-e97d-4cac-81fc-2d5c551d1c30/volumes/kubernetes.io~projected/kube-api-access-kgz7q:{mountpoint:/var/lib/kubelet/pods/4e2c195f-e97d-4cac-81fc-2d5c551d1c30/volumes/kubernetes.io~projected/kube-api-access-kgz7q major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/52bdf7cc-f07d-487e-937c-6567f194947e/volumes/kubernetes.io~projected/kube-api-access-8dbmq:{mountpoint:/var/lib/kubelet/pods/52bdf7cc-f07d-487e-937c-6567f194947e/volumes/kubernetes.io~projected/kube-api-access-8dbmq major:0 minor:793 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/52bdf7cc-f07d-487e-937c-6567f194947e/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/52bdf7cc-f07d-487e-937c-6567f194947e/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:792 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~projected/kube-api-access-g997b:{mountpoint:/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~projected/kube-api-access-g997b major:0 minor:1151 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/client-ca-bundle major:0 minor:1145 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:1149 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:1150 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6230ed8f-4608-4168-8f5a-656f411b0ef7/volumes/kubernetes.io~projected/kube-api-access-wzrh8:{mountpoint:/var/lib/kubelet/pods/6230ed8f-4608-4168-8f5a-656f411b0ef7/volumes/kubernetes.io~projected/kube-api-access-wzrh8 major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~projected/kube-api-access-4p4hg:{mountpoint:/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~projected/kube-api-access-4p4hg major:0 minor:244 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~projected/kube-api-access-p5fnx:{mountpoint:/var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~projected/kube-api-access-p5fnx major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~secret/serving-cert major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~projected/kube-api-access-79qrb:{mountpoint:/var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~projected/kube-api-access-79qrb major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:590 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~projected/kube-api-access-9dg9r:{mountpoint:/var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~projected/kube-api-access-9dg9r major:0 minor:1046 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~secret/certs major:0 minor:1038 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1037 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~projected/kube-api-access-k7bq7:{mountpoint:/var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~projected/kube-api-access-k7bq7 major:0 minor:807 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:805 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~secret/webhook-cert major:0 minor:806 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e76fc3f-39a4-4f99-8603-38a94da6ea8e/volumes/kubernetes.io~projected/kube-api-access-5th4l:{mountpoint:/var/lib/kubelet/pods/6e76fc3f-39a4-4f99-8603-38a94da6ea8e/volumes/kubernetes.io~projected/kube-api-access-5th4l major:0 minor:375 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e76fc3f-39a4-4f99-8603-38a94da6ea8e/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/6e76fc3f-39a4-4f99-8603-38a94da6ea8e/volumes/kubernetes.io~secret/signing-key major:0 minor:374 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~projected/kube-api-access-m8bmw:{mountpoint:/var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~projected/kube-api-access-m8bmw major:0 minor:217 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~secret/srv-cert major:0 minor:602 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~projected/kube-api-access-bfvz6:{mountpoint:/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~projected/kube-api-access-bfvz6 major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/75aedbcd-f6ed-43a1-941b-4b04887ffe8e/volumes/kubernetes.io~projected/kube-api-access-dd6rv:{mountpoint:/var/lib/kubelet/pods/75aedbcd-f6ed-43a1-941b-4b04887ffe8e/volumes/kubernetes.io~projected/kube-api-access-dd6rv major:0 minor:802 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/75aedbcd-f6ed-43a1-941b-4b04887ffe8e/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/75aedbcd-f6ed-43a1-941b-4b04887ffe8e/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:799 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76cf2b01-33d9-47eb-be5d-44946c78bf20/volumes/kubernetes.io~projected/kube-api-access-nj527:{mountpoint:/var/lib/kubelet/pods/76cf2b01-33d9-47eb-be5d-44946c78bf20/volumes/kubernetes.io~projected/kube-api-access-nj527 major:0 minor:697 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76cf2b01-33d9-47eb-be5d-44946c78bf20/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/76cf2b01-33d9-47eb-be5d-44946c78bf20/volumes/kubernetes.io~secret/serving-cert major:0 minor:690 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7a51eeaf-1349-4bf3-932d-22ed5ce7c161/volumes/kubernetes.io~projected/kube-api-access-cfxw7:{mountpoint:/var/lib/kubelet/pods/7a51eeaf-1349-4bf3-932d-22ed5ce7c161/volumes/kubernetes.io~projected/kube-api-access-cfxw7 major:0 minor:770 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7a51eeaf-1349-4bf3-932d-22ed5ce7c161/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/7a51eeaf-1349-4bf3-932d-22ed5ce7c161/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:761 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:546 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~empty-dir/tmp major:0 minor:547 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~projected/kube-api-access-n7784:{mountpoint:/var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~projected/kube-api-access-n7784 major:0 minor:548 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~projected/kube-api-access-cqr6w:{mountpoint:/var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~projected/kube-api-access-cqr6w major:0 minor:148 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~secret/webhook-cert major:0 minor:140 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~projected/kube-api-access-7spvn:{mountpoint:/var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~projected/kube-api-access-7spvn major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~secret/webhook-certs major:0 minor:502 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~projected/kube-api-access-dnl28:{mountpoint:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~projected/kube-api-access-dnl28 major:0 minor:252 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/etcd-client major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/serving-cert major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~projected/kube-api-access-c7nhq:{mountpoint:/var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~projected/kube-api-access-c7nhq major:0 minor:795 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~secret/cert major:0 minor:791 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:788 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9778f8f5-b0d1-4967-9776-9db758bba3af/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/9778f8f5-b0d1-4967-9776-9db758bba3af/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1006 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~projected/kube-api-access major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~secret/serving-cert major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc/volumes/kubernetes.io~projected/kube-api-access-dr788:{mountpoint:/var/lib/kubelet/pods/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc/volumes/kubernetes.io~projected/kube-api-access-dr788 major:0 minor:787 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:780 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~projected/kube-api-access-zrgqb:{mountpoint:/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~projected/kube-api-access-zrgqb major:0 minor:255 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~secret/serving-cert major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/kube-api-access-46m89:{mountpoint:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/kube-api-access-46m89 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:433 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~projected/kube-api-access-7nfnb:{mountpoint:/var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~projected/kube-api-access-7nfnb major:0 minor:268 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:437 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:432 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ac09dba7-398c-4b0a-a415-edb73cb4cf30/volumes/kubernetes.io~projected/kube-api-access-pbhv4:{mountpoint:/var/lib/kubelet/pods/ac09dba7-398c-4b0a-a415-edb73cb4cf30/volumes/kubernetes.io~projected/kube-api-access-pbhv4 major:0 minor:803 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac09dba7-398c-4b0a-a415-edb73cb4cf30/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/ac09dba7-398c-4b0a-a415-edb73cb4cf30/volumes/kubernetes.io~secret/cert major:0 minor:797 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~projected/kube-api-access-qql5t:{mountpoint:/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~projected/kube-api-access-qql5t major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~secret/serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~projected/kube-api-access-nlr9q:{mountpoint:/var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~projected/kube-api-access-nlr9q major:0 minor:270 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:601 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b5c7eb66-e23e-40df-883c-fed012c07f26/volumes/kubernetes.io~projected/kube-api-access-tx487:{mountpoint:/var/lib/kubelet/pods/b5c7eb66-e23e-40df-883c-fed012c07f26/volumes/kubernetes.io~projected/kube-api-access-tx487 major:0 minor:796 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b5c7eb66-e23e-40df-883c-fed012c07f26/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/b5c7eb66-e23e-40df-883c-fed012c07f26/volumes/kubernetes.io~secret/proxy-tls major:0 minor:789 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bb22a965-9b36-40cd-993d-747a3978be8e/volumes/kubernetes.io~projected/kube-api-access-5p55f:{mountpoint:/var/lib/kubelet/pods/bb22a965-9b36-40cd-993d-747a3978be8e/volumes/kubernetes.io~projected/kube-api-access-5p55f major:0 minor:786 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bb22a965-9b36-40cd-993d-747a3978be8e/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/bb22a965-9b36-40cd-993d-747a3978be8e/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:785 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/volumes/kubernetes.io~projected/kube-api-access-lqcvx:{mountpoint:/var/lib/kubelet/pods/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/volumes/kubernetes.io~projected/kube-api-access-lqcvx major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c8d8a09f-22d5-4f16-84d6-d5f2c504c949/volumes/kubernetes.io~projected/kube-api-access-p5jsb:{mountpoint:/var/lib/kubelet/pods/c8d8a09f-22d5-4f16-84d6-d5f2c504c949/volumes/kubernetes.io~projected/kube-api-access-p5jsb major:0 minor:965 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c8d8a09f-22d5-4f16-84d6-d5f2c504c949/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/c8d8a09f-22d5-4f16-84d6-d5f2c504c949/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:934 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103/volumes/kubernetes.io~projected/kube-api-access-hg6sp:{mountpoint:/var/lib/kubelet/pods/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103/volumes/kubernetes.io~projected/kube-api-access-hg6sp major:0 minor:359 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~projected/kube-api-access-lkm97:{mountpoint:/var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~projected/kube-api-access-lkm97 major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~secret/srv-cert major:0 minor:599 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4/volumes/kubernetes.io~projected/kube-api-access-894bt:{mountpoint:/var/lib/kubelet/pods/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4/volumes/kubernetes.io~projected/kube-api-access-894bt major:0 minor:337 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~projected/kube-api-access-4bbtl:{mountpoint:/var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~projected/kube-api-access-4bbtl major:0 minor:1090 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1082 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d1eef757-d63a-4708-8efe-7b27eea1ff63/volumes/kubernetes.io~projected/kube-api-access-kbq7n:{mountpoint:/var/lib/kubelet/pods/d1eef757-d63a-4708-8efe-7b27eea1ff63/volumes/kubernetes.io~projected/kube-api-access-kbq7n major:0 minor:79 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d625c81e-01cc-424a-997d-546a5204a72b/volumes/kubernetes.io~projected/kube-api-access-tgzdh:{mountpoint:/var/lib/kubelet/pods/d625c81e-01cc-424a-997d-546a5204a72b/volumes/kubernetes.io~projected/kube-api-access-tgzdh major:0 minor:318 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~projected/kube-api-access-72jlb:{mountpoint:/var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~projected/kube-api-access-72jlb major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~projected/kube-api-access major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~secret/serving-cert major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dd6ec279-d92f-45c2-97c2-88b96fbd6600/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/dd6ec279-d92f-45c2-97c2-88b96fbd6600/volumes/kubernetes.io~projected/kube-api-access major:0 minor:435 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dd6ec279-d92f-45c2-97c2-88b96fbd6600/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/dd6ec279-d92f-45c2-97c2-88b96fbd6600/volumes/kubernetes.io~secret/serving-cert major:0 minor:98 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~projected/kube-api-access-lscpq:{mountpoint:/var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~projected/kube-api-access-lscpq major:0 minor:1092 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config major:0 minor:1088 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1086 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~projected/kube-api-access-trcb7:{mountpoint:/var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~projected/kube-api-access-trcb7 major:0 minor:1013 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/default-certificate major:0 minor:1011 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1012 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/stats-auth major:0 minor:1010 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~projected/kube-api-access-dvkxx:{mountpoint:/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~projected/kube-api-access-dvkxx major:0 minor:649 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/encryption-config major:0 minor:648 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/etcd-client major:0 minor:647 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/serving-cert major:0 minor:642 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~projected/kube-api-access-88ghj:{mountpoint:/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~projected/kube-api-access-88ghj major:0 minor:472 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/encryption-config major:0 minor:430 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/etcd-client major:0 minor:460 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/serving-cert major:0 minor:431 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~projected/kube-api-access-s5rm4:{mountpoint:/var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~projected/kube-api-access-s5rm4 major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:600 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85/volumes/kubernetes.io~projected/kube-api-access-n5skx:{mountpoint:/var/lib/kubelet/pods/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85/volumes/kubernetes.io~projected/kube-api-access-n5skx major:0 minor:696 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85/volumes/kubernetes.io~secret/serving-cert major:0 minor:695 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e65e2a2f-16b5-44a3-9860-741f70188ab5/volumes/kubernetes.io~projected/kube-api-access-4fvvj:{mountpoint:/var/lib/kubelet/pods/e65e2a2f-16b5-44a3-9860-741f70188ab5/volumes/kubernetes.io~projected/kube-api-access-4fvvj major:0 minor:1014 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~projected/kube-api-access-nds54:{mountpoint:/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~projected/kube-api-access-nds54 major:0 minor:251 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~secret/serving-cert major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~projected/kube-api-access-bgs4l:{mountpoint:/var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~projected/kube-api-access-bgs4l major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~secret/metrics-certs major:0 minor:591 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5/volumes/kubernetes.io~projected/kube-api-access-jdbjk:{mountpoint:/var/lib/kubelet/pods/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5/volumes/kubernetes.io~projected/kube-api-access-jdbjk major:0 minor:349 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~projected/kube-api-access-zndqq:{mountpoint:/var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~projected/kube-api-access-zndqq major:0 minor:1060 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:1059 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:1066 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~projected/kube-api-access-v27lg:{mountpoint:/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~projected/kube-api-access-v27lg major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} overlay_0-1000:{mountpoint:/var/lib/containers/storage/overlay/c7544baf43a4729fdde76b9439f0536aeadeab7d7860b672d7b72e10d7540f66/merged major:0 minor:1000 fsType:overlay blockSize:0} overlay_0-101:{mountpoint:/var/lib/containers/storage/overlay/bcf70faa811c2d8894b8c7f8cc8e5aaf25df94bbfa45659fa429e361c83870d9/merged major:0 minor:101 fsType:overlay blockSize:0} overlay_0-1021:{mountpoint:/var/lib/containers/storage/overlay/1abb52c5ce8b2e5329f932ba06b41fcb8a9454fce7b42298ff1c29a6fb23852a/merged major:0 minor:1021 fsType:overlay blockSize:0} overlay_0-1023:{mountpoint:/var/lib/containers/storage/overlay/55485cebeaf194e9bf3f0d0e21f0065d67acc07d1a71dab40afafca44d76deaf/merged major:0 minor:1023 fsType:overlay blockSize:0} overlay_0-1025:{mountpoint:/var/lib/containers/storage/overlay/ee1e50efbb4f4e515022d4efe70eadb6626f9f2de1de2bd5a90a96ded6045358/merged major:0 minor:1025 fsType:overlay blockSize:0} overlay_0-1027:{mountpoint:/var/lib/containers/storage/overlay/c3b96cac679cec5a91a5ec5fb309af5b7971eec3a2f90b2fae23bd1a5071708a/merged major:0 minor:1027 fsType:overlay blockSize:0} overlay_0-1029:{mountpoint:/var/lib/containers/storage/overlay/c0f1b367e50eb1e825f5be6f85c16fe49b1edc929ff4241e63a8beed2cbd4407/merged major:0 minor:1029 fsType:overlay blockSize:0} overlay_0-103:{mountpoint:/var/lib/containers/storage/overlay/6c5618531b1c799b895390ee5440a94042807ff01e7eea0c54caf8b1a20dc06e/merged major:0 minor:103 fsType:overlay blockSize:0} 
overlay_0-1035:{mountpoint:/var/lib/containers/storage/overlay/d03df4a379289e9b3529328615498f9cfd324ee248ee7398cae12a2b3bca2458/merged major:0 minor:1035 fsType:overlay blockSize:0} overlay_0-1049:{mountpoint:/var/lib/containers/storage/overlay/8e652c9920b9f153cbb6d32250fab1752348875d3b5011cea4194130e420625d/merged major:0 minor:1049 fsType:overlay blockSize:0} overlay_0-1051:{mountpoint:/var/lib/containers/storage/overlay/d24c2e69d6067b332a05d5c6e2bc61be3fe34b7221ba84083ea1adb8bcc697b6/merged major:0 minor:1051 fsType:overlay blockSize:0} overlay_0-1069:{mountpoint:/var/lib/containers/storage/overlay/46262d110d6998e3add48d5cb500b45fde9ac165cd0a258cca9dd6fb7410c06f/merged major:0 minor:1069 fsType:overlay blockSize:0} overlay_0-1072:{mountpoint:/var/lib/containers/storage/overlay/b19c57ba58455fec351149824533d96913195b9e1d7f36751415152a5dadcdf4/merged major:0 minor:1072 fsType:overlay blockSize:0} overlay_0-1074:{mountpoint:/var/lib/containers/storage/overlay/e74748499edb439c56221396e7ebaaf87e8d0cb72ea550b0ffbbf1ffef78fb86/merged major:0 minor:1074 fsType:overlay blockSize:0} overlay_0-1076:{mountpoint:/var/lib/containers/storage/overlay/5bf203accded08ff5966bb233b1ac4ea01c70ed9a789798623fff80a0816e5fb/merged major:0 minor:1076 fsType:overlay blockSize:0} overlay_0-109:{mountpoint:/var/lib/containers/storage/overlay/9bea1b829767a4f691739ef5ec26c6edbcf53a734d045c7614506033a3ca6fb4/merged major:0 minor:109 fsType:overlay blockSize:0} overlay_0-1097:{mountpoint:/var/lib/containers/storage/overlay/48249643f4e94b143e61c51eda7c83d36417da0073fe85d867e6b5e603564096/merged major:0 minor:1097 fsType:overlay blockSize:0} overlay_0-1099:{mountpoint:/var/lib/containers/storage/overlay/bb2c51f9b4940d28ed604cc467aa7692937fc53a5f8651a73b03af5b1e283cfe/merged major:0 minor:1099 fsType:overlay blockSize:0} overlay_0-1101:{mountpoint:/var/lib/containers/storage/overlay/e5a05ffa624fdd401ff81a6e32bceb151750017943e7e7c2c6a512ce26b95074/merged major:0 minor:1101 fsType:overlay blockSize:0} 
overlay_0-1108:{mountpoint:/var/lib/containers/storage/overlay/85e22939a9386b6f24ff9e83b419a20095ea04df276817db356b4282fd42ef31/merged major:0 minor:1108 fsType:overlay blockSize:0} overlay_0-1112:{mountpoint:/var/lib/containers/storage/overlay/2c9d549e5ba80fedfa8a1daa59b426c6eb8ec109e6f54e12e2995e93be2ebb1a/merged major:0 minor:1112 fsType:overlay blockSize:0} overlay_0-1116:{mountpoint:/var/lib/containers/storage/overlay/5b9ad4756ad677c5bb43970c9d393381cbb71d8245142636bd8ddc467d8299ea/merged major:0 minor:1116 fsType:overlay blockSize:0} overlay_0-1119:{mountpoint:/var/lib/containers/storage/overlay/cdcaf1bc935eddf567f770051016ecd44778094de783c1b7130c21e4c5590811/merged major:0 minor:1119 fsType:overlay blockSize:0} overlay_0-1121:{mountpoint:/var/lib/containers/storage/overlay/86c0931759485d76747486cc4e66cfb3afe3d98a1398c05e8c317733a0dc7745/merged major:0 minor:1121 fsType:overlay blockSize:0} overlay_0-1123:{mountpoint:/var/lib/containers/storage/overlay/3c35f5a0a2627263c1bd68e0a0486299c8f4fb0c8163a70d41b673523101fefc/merged major:0 minor:1123 fsType:overlay blockSize:0} overlay_0-1135:{mountpoint:/var/lib/containers/storage/overlay/b2799384e5287669dcae918df8f35c56fa24f7e5e9317ded0429c3637ba7e8da/merged major:0 minor:1135 fsType:overlay blockSize:0} overlay_0-1137:{mountpoint:/var/lib/containers/storage/overlay/fd716751c1ba5fe6d408886f80ab437a411a9694c23a7691b17e66b2d27dd54c/merged major:0 minor:1137 fsType:overlay blockSize:0} overlay_0-1154:{mountpoint:/var/lib/containers/storage/overlay/a0d4c99404d996c2fbed268167beafe5a831036932336fb42fd764c5598b3f38/merged major:0 minor:1154 fsType:overlay blockSize:0} overlay_0-1156:{mountpoint:/var/lib/containers/storage/overlay/3fad77a4553f3d5dafa2216483497cee5172da98dbffb93f370952ecd1503344/merged major:0 minor:1156 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/5c39a2c50e7698d736a8f7aa9d82d7b1c767bb53a2e526abb9bcf7dec87710fd/merged major:0 minor:116 fsType:overlay blockSize:0} 
overlay_0-1165:{mountpoint:/var/lib/containers/storage/overlay/1c72decf0132bbc1c87d02c8949e3ba9c627f820aff9c35eb45cbbfbc9418690/merged major:0 minor:1165 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/5c645afb383fc8074a4c59ffdc9c4a81db7d9b56acb3fe4b583812122ceabd9e/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/574ab10504f42965944c2de9b826f2a3c48da25940ae1a5029f5de0fbd30c41d/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/4d7148e2f461ca7b5d085d69f3008c934c2194c8c6e27eb41226c932eda41d36/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/d744d7ba91f54f87fca24f93497c9738c3f4d45695c17724794ab01f2e913065/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/d30e06da4d0f0e414aaa9ddee59ab19514b095d25b03ae7570b4cc67b6c8a3f8/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/f7cdea35bd3a0c1d454076ac4e45750456617c7c920da7ad2dc2f8c7c3affe44/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/9d96aa4936e3fb6cd07bb60fdf9cd1bc30ff6d7c9a37aef6f94cede82c5e1d5f/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/cabcb006068adce359172b64cf71dc757be1889693be5a6acb7042d86531dd99/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/eea2ab4fa3e63f8fb62aa0b968d5e9f4e75934f9402a1210152af29daf22eaed/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/a48a43b09796d74b11a31d6c53831c9ace6e77765df19016f9bc08d4f9d6817a/merged major:0 minor:160 fsType:overlay blockSize:0} 
overlay_0-165:{mountpoint:/var/lib/containers/storage/overlay/5d44551ffb5361ef0038f2a5c834c8081fc1d28cca91f3020356d7b9ec601e22/merged major:0 minor:165 fsType:overlay blockSize:0} overlay_0-176:{mountpoint:/var/lib/containers/storage/overlay/dc7459596b41efabb85efa8a4232d0650f2220dd0b42485b9227b721eb903142/merged major:0 minor:176 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/a4939552f5e4ff970629e6e2dbdfdc01e7c6d473dfc4fedc68a9f917226a78e6/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/6205811f38af910b672d6509ddd746f40e600578e003fe0e87ba6f6b37b08a9f/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/192e8097cd63a2bcb58516710fe65a40087b0721e1713ca7c191d04452bfb462/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/e33cce358066d48b4605417812ab3130253ee6d6a268a012a37cb3108af75a9a/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/41fdb8de4747eea147fdef88bb0244292ee4fe2c57a274177913bcc925d9c778/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-247:{mountpoint:/var/lib/containers/storage/overlay/60460309109fa213629ccac84410abe1463026fc9ace85699a0982187139cf85/merged major:0 minor:247 fsType:overlay blockSize:0} overlay_0-257:{mountpoint:/var/lib/containers/storage/overlay/f05abb2f329b589e4e59ece7f196bba601bd9189e171b475bcf714ae2f01f7fb/merged major:0 minor:257 fsType:overlay blockSize:0} overlay_0-264:{mountpoint:/var/lib/containers/storage/overlay/6ac1b2db9b9e9f6953e526b0a759095a6277c2f58374b028d3549888e57e033e/merged major:0 minor:264 fsType:overlay blockSize:0} overlay_0-266:{mountpoint:/var/lib/containers/storage/overlay/d53e4460daa3867366faebedffce96d156dc47dee7cfc1e2a58575b965849830/merged major:0 minor:266 fsType:overlay blockSize:0} 
overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/e41954e50b7df848de86ee8ec7a85d8d1957ce444a3ff0b5ffa3d76d4fc00b59/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/118779ca8538f4e3430011d4bc8bb91869cfe242e883e528305ddc2ddebce585/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/87f5516508b5a9fb425ff147a188f8c50995524820e416bc408b215c9391f54b/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/92524f63eb0719d9216a69cc8ff6d41a4c28199044cec7ff060a8507a8235f24/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/41290067078e96f0e73a83864d568f75bb0da2062ea687985842db643540d70b/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/131a0d91350a2fa9177e2706727a37a785ef558d75a753c85c5515977845930c/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/305a17411a200e62d5f3c47734017b210e55850da737ecd47056016c85c5c0aa/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/65152adb709820b652652ceb03d80ddac68c1eb3ed264a5c63058539b0566268/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/9d508a245d78633dbb5414b178957a0e60fc4a1937a644c5d2fb394bff2ae30c/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/5b49897fc3a8c4f71d497894a79ed190feffbf63536294a0810eb9bc61384ae0/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-304:{mountpoint:/var/lib/containers/storage/overlay/af98871a0be5cc552fe57fb5e37d1c748fc8a63e09ee8989d11d32d7bec83da8/merged major:0 minor:304 fsType:overlay blockSize:0} 
overlay_0-306:{mountpoint:/var/lib/containers/storage/overlay/2615190c96bdb87417c0a53ddb489d913f797d581c04f3ba79b649f65dc37988/merged major:0 minor:306 fsType:overlay blockSize:0} overlay_0-310:{mountpoint:/var/lib/containers/storage/overlay/6f93ad9578210f421b8d042b0e30d8594365bd537c5fdbe5ed49d400c14e421d/merged major:0 minor:310 fsType:overlay blockSize:0} overlay_0-312:{mountpoint:/var/lib/containers/storage/overlay/db31cdf96c353784570cdcf358810867aa33f6ff68a615b17648a8c1c1649980/merged major:0 minor:312 fsType:overlay blockSize:0} overlay_0-314:{mountpoint:/var/lib/containers/storage/overlay/e3b199bc6530a943216365bb0456005c5ab4594dd08981441e6c56c25cfab628/merged major:0 minor:314 fsType:overlay blockSize:0} overlay_0-316:{mountpoint:/var/lib/containers/storage/overlay/b48f683bc41b3f56312cf4f50f9abe0f633e61962be5f409fd9e8037604ccae0/merged major:0 minor:316 fsType:overlay blockSize:0} overlay_0-319:{mountpoint:/var/lib/containers/storage/overlay/607f670bd1cc4aa764d659fefb43d60f2d2b4f2c3421f789102d31da1bc88ece/merged major:0 minor:319 fsType:overlay blockSize:0} overlay_0-333:{mountpoint:/var/lib/containers/storage/overlay/9aefd1fcab9bf4b9dc0dcc308db34e2c06fa711aeafb71f5d92567130b8d8895/merged major:0 minor:333 fsType:overlay blockSize:0} overlay_0-342:{mountpoint:/var/lib/containers/storage/overlay/1d22b7f8715e5508c51824beb134bfed8c9766dac69ba729bf6c69a7007720c0/merged major:0 minor:342 fsType:overlay blockSize:0} overlay_0-345:{mountpoint:/var/lib/containers/storage/overlay/98c38035792bb74d542f7a3ae4cc8154a1e7feedca8a331564de853a05d5eac6/merged major:0 minor:345 fsType:overlay blockSize:0} overlay_0-347:{mountpoint:/var/lib/containers/storage/overlay/99bd5415edaed0f806fd2fc154bf02bf8d6567fef3d5a7f4b5cd832f059e39e6/merged major:0 minor:347 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/7eecae3df13c45084d45e600172d0495dd5b6cdd39b5c5f68395f8e4e69bccc7/merged major:0 minor:352 fsType:overlay blockSize:0} 
overlay_0-353:{mountpoint:/var/lib/containers/storage/overlay/304f8580b912b9f95a6df569e32a0e5345a4343babb2cd0a8a22bcaffac0e138/merged major:0 minor:353 fsType:overlay blockSize:0} overlay_0-356:{mountpoint:/var/lib/containers/storage/overlay/a850aad3dd77cd14ff9390ff0804ac3f013f86fea6df6f125539fc5e82427637/merged major:0 minor:356 fsType:overlay blockSize:0} overlay_0-361:{mountpoint:/var/lib/containers/storage/overlay/34c0d35d67d6aff58511820db841d13b5429a102e0cdd365a7b87cb98af214c0/merged major:0 minor:361 fsType:overlay blockSize:0} overlay_0-362:{mountpoint:/var/lib/containers/storage/overlay/a7b8357c08cf596095ae578c3252a09596efa19c1ed2e998cdf7b33fc4428706/merged major:0 minor:362 fsType:overlay blockSize:0} overlay_0-366:{mountpoint:/var/lib/containers/storage/overlay/018b0e89048225c9d7bb87814f6c1dd64f1f5af62ddce04d6cd766b79fafa273/merged major:0 minor:366 fsType:overlay blockSize:0} overlay_0-368:{mountpoint:/var/lib/containers/storage/overlay/43d348d8038fff4f30ae8f7349c3a0a79c18daa243c217a6800fa3e6b1e94477/merged major:0 minor:368 fsType:overlay blockSize:0} overlay_0-370:{mountpoint:/var/lib/containers/storage/overlay/993c5785f5741f7d3ea535e4a6724696cc8c6c3bdeba48f360858c1b7de779db/merged major:0 minor:370 fsType:overlay blockSize:0} overlay_0-378:{mountpoint:/var/lib/containers/storage/overlay/af367cc19bdfb4a23e30d28719ce4dc423b2f3f77937a6bd5ec57d8fb81fc63c/merged major:0 minor:378 fsType:overlay blockSize:0} overlay_0-403:{mountpoint:/var/lib/containers/storage/overlay/15ab5a11b77a10d82908fa52af09f4df3e7651ec84af3c2540748d0654e8fe54/merged major:0 minor:403 fsType:overlay blockSize:0} overlay_0-405:{mountpoint:/var/lib/containers/storage/overlay/11f597bd4099308d897bd353bdce2aa1133002bdfad1f379f9c38ed84ffa8bf9/merged major:0 minor:405 fsType:overlay blockSize:0} overlay_0-409:{mountpoint:/var/lib/containers/storage/overlay/c9622732382bed15d781f83a3f6ce746545992f0dbbd20dab2103834eb9a9856/merged major:0 minor:409 fsType:overlay blockSize:0} 
overlay_0-411:{mountpoint:/var/lib/containers/storage/overlay/6a8d9eaf7400c4851258a3b1d734f882d25a6804fe2c8e77965607ae5acb1660/merged major:0 minor:411 fsType:overlay blockSize:0} overlay_0-416:{mountpoint:/var/lib/containers/storage/overlay/229cb6a247f0beaf2823638e5011138cd5dae404bd58025d1c74d91c32cb92da/merged major:0 minor:416 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/e9e30509c242a04163aa2fd2dd37213dbc6fc203316c4ad0b34e15b260a6966a/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-448:{mountpoint:/var/lib/containers/storage/overlay/584c5b4697cb6bbc20144ff95a373bbf764ec34f1b0dca0b6d0c45923162c3da/merged major:0 minor:448 fsType:overlay blockSize:0} overlay_0-450:{mountpoint:/var/lib/containers/storage/overlay/3fd42ab764ba5dfd1dcddcfd2922650e4740fdf3fdc70548e3e9f65f409cea5b/merged major:0 minor:450 fsType:overlay blockSize:0} overlay_0-452:{mountpoint:/var/lib/containers/storage/overlay/930a32073e1a243da5e20f5df65f79e89ef0a855b934f7c5c6c8e38a1e45914d/merged major:0 minor:452 fsType:overlay blockSize:0} overlay_0-454:{mountpoint:/var/lib/containers/storage/overlay/bd9400d4cf0e08a0c4e0f0f21db2d692fbd4d28dc61ac578eae6b4ef5e399ea4/merged major:0 minor:454 fsType:overlay blockSize:0} overlay_0-456:{mountpoint:/var/lib/containers/storage/overlay/2ab4ee896197d661429be0783e7c81c2eb195109d77d93235867b0fc07c25171/merged major:0 minor:456 fsType:overlay blockSize:0} overlay_0-458:{mountpoint:/var/lib/containers/storage/overlay/7906348c750f28acaa26989f6fdbaad20df25f29883a1b011b862b0c4a7b011a/merged major:0 minor:458 fsType:overlay blockSize:0} overlay_0-469:{mountpoint:/var/lib/containers/storage/overlay/51da1d3176afe14489667e8a77ad1d91db0e6ed054f7e7bbcbf6bfb6db303846/merged major:0 minor:469 fsType:overlay blockSize:0} overlay_0-471:{mountpoint:/var/lib/containers/storage/overlay/defe81ac187bc93408757b550d643e4b9de436584c79b90e96aa852197e9ba16/merged major:0 minor:471 fsType:overlay blockSize:0} 
overlay_0-476:{mountpoint:/var/lib/containers/storage/overlay/3bf4fa5686270fa3564cdbb984c4fe2c4791be61cc13845eddf3483c328fdf8b/merged major:0 minor:476 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/a70e0382f07837be5270314eb686264885a5f0c090da96da41f6fbea5f5bbe60/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-482:{mountpoint:/var/lib/containers/storage/overlay/7bd5b3a7893df4a7b1b6cebd7ee03905939b35c152a1cbf5c32afc17c3b21f47/merged major:0 minor:482 fsType:overlay blockSize:0} overlay_0-484:{mountpoint:/var/lib/containers/storage/overlay/5b8f2583129fd369276b87f8cc2965ddc0e2bb3517b6a19cd6ba87ed18fc0eb2/merged major:0 minor:484 fsType:overlay blockSize:0} overlay_0-492:{mountpoint:/var/lib/containers/storage/overlay/26c3e7c53ef45859e3e46398cbad1c818d0903cdc1c12f550a2905cc1c9ad9d3/merged major:0 minor:492 fsType:overlay blockSize:0} overlay_0-494:{mountpoint:/var/lib/containers/storage/overlay/04c65d6a167e44061f2bc7b3f2ec11ea1bfd9e7f1a14340802e7aa4175eacf17/merged major:0 minor:494 fsType:overlay blockSize:0} overlay_0-501:{mountpoint:/var/lib/containers/storage/overlay/ef3b0caeba6babb65c54907c02862f26725351ae531af6079e6f12e071c8ccfd/merged major:0 minor:501 fsType:overlay blockSize:0} overlay_0-512:{mountpoint:/var/lib/containers/storage/overlay/3266c31011a1f503372c0cf5d79ec055b72ea0b677175db94db15ea599b5af22/merged major:0 minor:512 fsType:overlay blockSize:0} overlay_0-513:{mountpoint:/var/lib/containers/storage/overlay/0c60560ada11bc32f8c9573f5ea89072f0914e7124c3d5b80985ce7a2e9e87a2/merged major:0 minor:513 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/9ef51aee0cd3466360df38b51e0a8808e8f2aafc5d7443e6a7afef98fa7d9883/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-524:{mountpoint:/var/lib/containers/storage/overlay/d9aa5943540b5aae5a90db483ab5cc0e8f1a0c528f6bb9b9aaae9d83630827d1/merged major:0 minor:524 fsType:overlay blockSize:0} 
overlay_0-528:{mountpoint:/var/lib/containers/storage/overlay/313d8911413f5898febe3fe1f5e83bc8f1c092a98150f8e073904ce73d359e97/merged major:0 minor:528 fsType:overlay blockSize:0} overlay_0-530:{mountpoint:/var/lib/containers/storage/overlay/4b04bd5327331de45bdbc9a1cc3053918715b13d3de7d60e187216a94f1c6ae0/merged major:0 minor:530 fsType:overlay blockSize:0} overlay_0-534:{mountpoint:/var/lib/containers/storage/overlay/103dc7b48f15cd84e8f83f3ebb67ae3dc779fd60bf4e0882eba3d6e74667f493/merged major:0 minor:534 fsType:overlay blockSize:0} overlay_0-538:{mountpoint:/var/lib/containers/storage/overlay/b89b8e7cfa9c025c98bdbe51729ae26303c26bfbcf22fc9871ca941bc7fb1306/merged major:0 minor:538 fsType:overlay blockSize:0} overlay_0-541:{mountpoint:/var/lib/containers/storage/overlay/382e8c84c1d521ae0a8dac855cccc7daa2fbb2679580782fe42ee1ef2807936a/merged major:0 minor:541 fsType:overlay blockSize:0} overlay_0-552:{mountpoint:/var/lib/containers/storage/overlay/4ecc7c862a111897604337a9ebfe854f5adadde8efdbbda246d91171f50e9802/merged major:0 minor:552 fsType:overlay blockSize:0} overlay_0-554:{mountpoint:/var/lib/containers/storage/overlay/891b766a6bcf1b6fdb51af854bff37cb8b28ac44fcac7a8d9feaa634c48a6cd5/merged major:0 minor:554 fsType:overlay blockSize:0} overlay_0-555:{mountpoint:/var/lib/containers/storage/overlay/4e1633da8f96733e6fe40d8767c893eb49e73aa92c7dcf5be74b0facb6907863/merged major:0 minor:555 fsType:overlay blockSize:0} overlay_0-558:{mountpoint:/var/lib/containers/storage/overlay/2ba5fd31305e85b8a703af4f7b7a46c5673a04fa557482a4c710e7d499375613/merged major:0 minor:558 fsType:overlay blockSize:0} overlay_0-564:{mountpoint:/var/lib/containers/storage/overlay/48255a7857a0efa9e9de93553f9cb2fe0aaf3173839d4d9d2d1e025d4f32d05c/merged major:0 minor:564 fsType:overlay blockSize:0} overlay_0-572:{mountpoint:/var/lib/containers/storage/overlay/7f871a30e6904e933ed0d1f4c7e8ef14df8a69d2958e9a8e96a02e627629169d/merged major:0 minor:572 fsType:overlay blockSize:0} 
overlay_0-579:{mountpoint:/var/lib/containers/storage/overlay/bb7d7a82904ef53777a826fd809a707969f159aef5035464d5fc6c297ff97d17/merged major:0 minor:579 fsType:overlay blockSize:0} overlay_0-58:{mountpoint:/var/lib/containers/storage/overlay/6c788c89eb5bb5f3fa48926b5a55e5c5f843ac1f7887013b2a609d3645593d9e/merged major:0 minor:58 fsType:overlay blockSize:0} overlay_0-585:{mountpoint:/var/lib/containers/storage/overlay/48abe2b9ff2213e79be46f092ccc52eeb43b97e07f0fb6a9b07f22e2706d1d10/merged major:0 minor:585 fsType:overlay blockSize:0} overlay_0-587:{mountpoint:/var/lib/containers/storage/overlay/d9449c123237555b57cf3f2471e9fb692a39d2386047529064d96df812758111/merged major:0 minor:587 fsType:overlay blockSize:0} overlay_0-592:{mountpoint:/var/lib/containers/storage/overlay/3073e0192cce4c8d8607070b6cc6bbb1e0a6e60f3f333c519cc95f2ceeb56364/merged major:0 minor:592 fsType:overlay blockSize:0} overlay_0-596:{mountpoint:/var/lib/containers/storage/overlay/ed09d2e8c225abc88422ff524cc015d208c224fe2cd48db89f2b2e2135ab31ea/merged major:0 minor:596 fsType:overlay blockSize:0} overlay_0-603:{mountpoint:/var/lib/containers/storage/overlay/14cdc2feabe9a732574bae285a52fd205698f9b2677a9451a4a072db63ffe210/merged major:0 minor:603 fsType:overlay blockSize:0} overlay_0-605:{mountpoint:/var/lib/containers/storage/overlay/780f6db6f4f0efdc1877edde106a6117fa4742e8e1dc6a16f82a85ab9b3dd82a/merged major:0 minor:605 fsType:overlay blockSize:0} overlay_0-611:{mountpoint:/var/lib/containers/storage/overlay/2d9e14391c3d76890a42470f3afa131b59489ca0ebbb884a321997ee626bda40/merged major:0 minor:611 fsType:overlay blockSize:0} overlay_0-615:{mountpoint:/var/lib/containers/storage/overlay/1bc4b623079cb22561feb7dff569f1fce1b56af8b68b81fd253529bfbad96d01/merged major:0 minor:615 fsType:overlay blockSize:0} overlay_0-621:{mountpoint:/var/lib/containers/storage/overlay/0e5c23a94ae23e680c6310b28ea74ca309a2c6f85c26a448f9d96f099d3f9e6b/merged major:0 minor:621 fsType:overlay blockSize:0} 
overlay_0-633:{mountpoint:/var/lib/containers/storage/overlay/0b9e9fb6a174a71296ab94a244aefb83f694788c348d7af0f3fde54889b066df/merged major:0 minor:633 fsType:overlay blockSize:0} overlay_0-644:{mountpoint:/var/lib/containers/storage/overlay/5cc775d1eb5b35b2836a754e26c18843e5b88bb5e741c9d03f0bcbb167ab2ddf/merged major:0 minor:644 fsType:overlay blockSize:0} overlay_0-65:{mountpoint:/var/lib/containers/storage/overlay/ba9ae18b7996e50b834b2539e45940f2f546ecc5da05febefc3547bb0914be18/merged major:0 minor:65 fsType:overlay blockSize:0} overlay_0-652:{mountpoint:/var/lib/containers/storage/overlay/f11a70c659a1eea92926d71b4bf82f93bc05f9ab69535475c99731f551b7d146/merged major:0 minor:652 fsType:overlay blockSize:0} overlay_0-654:{mountpoint:/var/lib/containers/storage/overlay/004bc814820deaf7dbc463b6ef9c55c100b972a33055ac392b403e6ab5f7666f/merged major:0 minor:654 fsType:overlay blockSize:0} overlay_0-658:{mountpoint:/var/lib/containers/storage/overlay/32fb905dab3f3b686d518e4527a01ac2fe4307c65b8b2ef7c4cdcd169b0abab2/merged major:0 minor:658 fsType:overlay blockSize:0} overlay_0-668:{mountpoint:/var/lib/containers/storage/overlay/179f4b0ffbc85199b6b44019f485fcc995283b8d370c7591a8f51674df03c526/merged major:0 minor:668 fsType:overlay blockSize:0} overlay_0-67:{mountpoint:/var/lib/containers/storage/overlay/a59fbf72c84c99ba83f9ab554a5d402ae49b415ee8db7ccb4bf1f04e1c57f67b/merged major:0 minor:67 fsType:overlay blockSize:0} overlay_0-678:{mountpoint:/var/lib/containers/storage/overlay/9840e2a040559c474f0953ce4e15e425bd23c874774a1cd780c55a6bcb5bc02f/merged major:0 minor:678 fsType:overlay blockSize:0} overlay_0-681:{mountpoint:/var/lib/containers/storage/overlay/9e05d3245bc31607828f5cd1f8defcc49b70b66c4433216496187b3c65a76197/merged major:0 minor:681 fsType:overlay blockSize:0} overlay_0-686:{mountpoint:/var/lib/containers/storage/overlay/fc2d7afb017d9ef8e4b230fed22e5f51ce4adb83c91bc178e4a45406b1b6d08b/merged major:0 minor:686 fsType:overlay blockSize:0} 
overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/5ea836555c20f4adf7ef54ab436fbea1affce9fb9e82b015215b1c9650c70010/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-691:{mountpoint:/var/lib/containers/storage/overlay/ade8044d4aab1430a50f404ed6a2237e224b57b9161fe357530c6d15f1503747/merged major:0 minor:691 fsType:overlay blockSize:0} overlay_0-693:{mountpoint:/var/lib/containers/storage/overlay/d4695ba2fd058e71a9df27d799afa03e870f4c1a1e6cbc362b85e98e0a9a6fca/merged major:0 minor:693 fsType:overlay blockSize:0} overlay_0-702:{mountpoint:/var/lib/containers/storage/overlay/3babec5e10ad03fb63cdaa35003eb2d7f2ad9b0bdacb1d0c2c04d6462eceef43/merged major:0 minor:702 fsType:overlay blockSize:0} overlay_0-704:{mountpoint:/var/lib/containers/storage/overlay/e36e66e74d2acfb344b06ce32cbcf586964eb5d354da52e03dad47c9bdaee241/merged major:0 minor:704 fsType:overlay blockSize:0} overlay_0-707:{mountpoint:/var/lib/containers/storage/overlay/8adb0a3d36bc07fd8d133b1787961724164411e09b6963bff373d93333023bd0/merged major:0 minor:707 fsType:overlay blockSize:0} overlay_0-708:{mountpoint:/var/lib/containers/storage/overlay/8262851ad6a4ee2b313c45dfd6b87e3d2d593bbdd23d8266fb69d8815559df7e/merged major:0 minor:708 fsType:overlay blockSize:0} overlay_0-71:{mountpoint:/var/lib/containers/storage/overlay/a0020eca8a07dbabab06f7c181e4fa2fa2296512866c19e0d563da3edbf98e57/merged major:0 minor:71 fsType:overlay blockSize:0} overlay_0-713:{mountpoint:/var/lib/containers/storage/overlay/6aa1a64384d43b9da6f13bb200c0e291371e68d20dffb42bb3de251718d1a26a/merged major:0 minor:713 fsType:overlay blockSize:0} overlay_0-720:{mountpoint:/var/lib/containers/storage/overlay/74ae94b105cd30d37b450e997768210b7e7f2fbcf8a0e344a06ee1074a41a538/merged major:0 minor:720 fsType:overlay blockSize:0} overlay_0-722:{mountpoint:/var/lib/containers/storage/overlay/51ae28a399ad087899073af5e8522bdee25e7733eda127a694ab8175189991c9/merged major:0 minor:722 fsType:overlay blockSize:0} 
overlay_0-727:{mountpoint:/var/lib/containers/storage/overlay/9e5caeb8a13112b70e169c5fb9a03ec1ccad74138e1495fcf37a50b4f90aa768/merged major:0 minor:727 fsType:overlay blockSize:0} overlay_0-734:{mountpoint:/var/lib/containers/storage/overlay/438e3b7c7f2f49a160efa1c43da3ec3891500c5f22846d072a710d16090c1a8b/merged major:0 minor:734 fsType:overlay blockSize:0} overlay_0-738:{mountpoint:/var/lib/containers/storage/overlay/e7fe016aeb4a4b647277a6b35b98456c0b32170e9fc608bcdb47c29ec97a1531/merged major:0 minor:738 fsType:overlay blockSize:0} overlay_0-743:{mountpoint:/var/lib/containers/storage/overlay/3afb9ae3c75f57f53eee53681777e0604531c522ead44ccc20cb84f02f4f3a0e/merged major:0 minor:743 fsType:overlay blockSize:0} overlay_0-77:{mountpoint:/var/lib/containers/storage/overlay/aaa91b7f119b61254915e57c669a0f39fdcc04ace131312cecf0dead22673235/merged major:0 minor:77 fsType:overlay blockSize:0} overlay_0-778:{mountpoint:/var/lib/containers/storage/overlay/a930f2ce76f763cc58b1758eb1d8f7940b4b70c279037ef3b641b47f1efa3adc/merged major:0 minor:778 fsType:overlay blockSize:0} overlay_0-811:{mountpoint:/var/lib/containers/storage/overlay/039def0319d0f872198ceaa120b9763a887ddef2f5d2cfec8d2e00e9e7f41464/merged major:0 minor:811 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/1bb6fffbcd28a236be330a461e3d5a3036084480d2dbca26cf095af2ac7a9a0e/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-821:{mountpoint:/var/lib/containers/storage/overlay/8dcef073b72427fa17cd46c1b0c93a743b190d11c8992d74a893f8dc62bae658/merged major:0 minor:821 fsType:overlay blockSize:0} overlay_0-828:{mountpoint:/var/lib/containers/storage/overlay/47ed46a27f4ba92bd766d8ba04c0417ef0486579d07a2953bfab4f03c3b69b7e/merged major:0 minor:828 fsType:overlay blockSize:0} overlay_0-830:{mountpoint:/var/lib/containers/storage/overlay/482239b8abbc784cbac50132dc9e3aa740b91c9bd86433ceb04578e502774979/merged major:0 minor:830 fsType:overlay blockSize:0} 
overlay_0-833:{mountpoint:/var/lib/containers/storage/overlay/553be9c1f4157e9adc7d9684007600223abcc0df6ebe982d3832d72fcce696a9/merged major:0 minor:833 fsType:overlay blockSize:0} overlay_0-836:{mountpoint:/var/lib/containers/storage/overlay/24ca425425ffad5715e68fe163135fdeaba1c0b6f8cd7454d82f8c2f062877d2/merged major:0 minor:836 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/b06fbabf02175130808b95f0811c9a0218b406b2ff1c41d2542f77ffda5633ef/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-844:{mountpoint:/var/lib/containers/storage/overlay/4d303df6a1c9d09712faa0629be1581aed9b1965957426f7fbe1daa1ed85f19e/merged major:0 minor:844 fsType:overlay blockSize:0} overlay_0-847:{mountpoint:/var/lib/containers/storage/overlay/66bb061622e3eb1f690fbb25eebeffd736fe0e4436a5f20f4ae98f9279b2c6d1/merged major:0 minor:847 fsType:overlay blockSize:0} overlay_0-852:{mountpoint:/var/lib/containers/storage/overlay/909dd1375211b38846580d0655d98720a2b98c3d1c9282e387aef8291b175e98/merged major:0 minor:852 fsType:overlay blockSize:0} overlay_0-855:{mountpoint:/var/lib/containers/storage/overlay/eb45941751da58102c6a6ff3983c03f27269678e2302b5b5b69429655dfb2baf/merged major:0 minor:855 fsType:overlay blockSize:0} overlay_0-857:{mountpoint:/var/lib/containers/storage/overlay/a359ff9aaa86c87c3e818b51f2d9ae9ba8b94f4ae43fee3d9caf8fa9d1c65999/merged major:0 minor:857 fsType:overlay blockSize:0} overlay_0-859:{mountpoint:/var/lib/containers/storage/overlay/02651b36d738e76f063c419c635b917b8e1493aa08d5c8254c3a6f570a5bbe06/merged major:0 minor:859 fsType:overlay blockSize:0} overlay_0-861:{mountpoint:/var/lib/containers/storage/overlay/d51c00891edaa2172472143235e4e06d681647ebe74ee96e824806fa7f1b1180/merged major:0 minor:861 fsType:overlay blockSize:0} overlay_0-863:{mountpoint:/var/lib/containers/storage/overlay/95c01d5d4d8bf273eb25a701abb8460b4296e650c90f358d05df4c5b3fce6fc8/merged major:0 minor:863 fsType:overlay blockSize:0} 
overlay_0-873:{mountpoint:/var/lib/containers/storage/overlay/173215e11b03065d3043278a8a673e7ec28157f6a53c09ca20032d9b8fb9b6e9/merged major:0 minor:873 fsType:overlay blockSize:0} overlay_0-875:{mountpoint:/var/lib/containers/storage/overlay/e143b9fb1d3e917c335ca8e8ec590f06d7c6d585733cbe61ad9cae88f2607331/merged major:0 minor:875 fsType:overlay blockSize:0} overlay_0-877:{mountpoint:/var/lib/containers/storage/overlay/e0c1e0dcbe799731ffb189b6d8901fc7d21027a58618991556f6953f66d73c6b/merged major:0 minor:877 fsType:overlay blockSize:0} overlay_0-879:{mountpoint:/var/lib/containers/storage/overlay/3213abdde4eb841105beaccd7fdee345c11defc62bb72c358e29343473389f51/merged major:0 minor:879 fsType:overlay blockSize:0} overlay_0-881:{mountpoint:/var/lib/containers/storage/overlay/fa84e6cccc34fc250c0469a7c1b8b636cab9818ec84730febbcc890a1f1ee99c/merged major:0 minor:881 fsType:overlay blockSize:0} overlay_0-883:{mountpoint:/var/lib/containers/storage/overlay/f137ca46acf6452b6dce528c7899053228573b9a852023181e2254563b2cbe80/merged major:0 minor:883 fsType:overlay blockSize:0} overlay_0-885:{mountpoint:/var/lib/containers/storage/overlay/d56a6a7bbc4e9c4646f37574072227baf8518aa4224389a2762a9fa64d8ab9ea/merged major:0 minor:885 fsType:overlay blockSize:0} overlay_0-887:{mountpoint:/var/lib/containers/storage/overlay/e6eebaed81e6003faaff5d077223da02b3091616ac9eacbf58e196d55dcf7614/merged major:0 minor:887 fsType:overlay blockSize:0} overlay_0-909:{mountpoint:/var/lib/containers/storage/overlay/4cda34cf699d58d63466683a5f3aaa0dedf7d6128100534facab0d37158f5c8d/merged major:0 minor:909 fsType:overlay blockSize:0} overlay_0-910:{mountpoint:/var/lib/containers/storage/overlay/942382df662efc66aaf6e3a34c1250d3b62cb3940ea4ccbc8b4da05cbef66287/merged major:0 minor:910 fsType:overlay blockSize:0} overlay_0-923:{mountpoint:/var/lib/containers/storage/overlay/edc88f4dd532a0766c0b202f5957d1b0d6542b38e902d75b5dff7f8316140ed1/merged major:0 minor:923 fsType:overlay blockSize:0} 
overlay_0-925:{mountpoint:/var/lib/containers/storage/overlay/6ed319e997f3f8d90949ae477d0cc63017234740c1242809f1a472f66f41dccb/merged major:0 minor:925 fsType:overlay blockSize:0} overlay_0-931:{mountpoint:/var/lib/containers/storage/overlay/493a0ac6910875c26674ed1d634c7b972532d86b8452444268f9d302b45f140b/merged major:0 minor:931 fsType:overlay blockSize:0} overlay_0-935:{mountpoint:/var/lib/containers/storage/overlay/f68873b1c2a5801795af0a514779e10b21c27e30766bfc0f073a94619eeb4c8a/merged major:0 minor:935 fsType:overlay blockSize:0} overlay_0-937:{mountpoint:/var/lib/containers/storage/overlay/81e8f64f3a4e8d62436f045ced32771909ba681bbeca2f83f83fcb5576306242/merged major:0 minor:937 fsType:overlay blockSize:0} overlay_0-95:{mountpoint:/var/lib/containers/storage/overlay/71e58b7e0ebeae40541f90adf20e88bbafb48acabb2e23ccbee81ec210b7d9a0/merged major:0 minor:95 fsType:overlay blockSize:0} overlay_0-954:{mountpoint:/var/lib/containers/storage/overlay/5905edc8365e72e90bd864423a1af8e413bf3c9fa1f841b52a71d7d9a644c791/merged major:0 minor:954 fsType:overlay blockSize:0} overlay_0-955:{mountpoint:/var/lib/containers/storage/overlay/8a285e5dbab25200438741435cbdb0bd2ca60bf02d0200a84c5e4c2b72430fea/merged major:0 minor:955 fsType:overlay blockSize:0} overlay_0-967:{mountpoint:/var/lib/containers/storage/overlay/0ddf73025f323bf0a9ee23fbabdb171166e926203a768d0edcd2c44a022c6862/merged major:0 minor:967 fsType:overlay blockSize:0} overlay_0-975:{mountpoint:/var/lib/containers/storage/overlay/a12db7b83bbd40b6d69056545126ede8202bf5a3214f1fb4e5e2140614b32502/merged major:0 minor:975 fsType:overlay blockSize:0} overlay_0-977:{mountpoint:/var/lib/containers/storage/overlay/b789af52e142557c8bcfbaeb110f0d0f3ec2932fa243f47d4e33464d0255f500/merged major:0 minor:977 fsType:overlay blockSize:0} overlay_0-979:{mountpoint:/var/lib/containers/storage/overlay/d13b7f6cb3aa5a45b4f3ff1da028d650ddfa59d7c1f20d897294ffe2a4d801d4/merged major:0 
minor:979 fsType:overlay blockSize:0} overlay_0-984:{mountpoint:/var/lib/containers/storage/overlay/1f262751ae8497a932fd32b88bef2af36a609c8994d7349ca2f8602b93e450cf/merged major:0 minor:984 fsType:overlay blockSize:0} overlay_0-996:{mountpoint:/var/lib/containers/storage/overlay/822673e8088ce67020e05eb80155aa4f756c1247a5d99527282d41b19d61abe3/merged major:0 minor:996 fsType:overlay blockSize:0} overlay_0-998:{mountpoint:/var/lib/containers/storage/overlay/4dda84b0759194ea9876c330d01d8ce4bb9949a364fcc7d45bc1f924ecbefcf7/merged major:0 minor:998 fsType:overlay blockSize:0}] Mar 19 11:59:26.413994 master-0 kubenswrapper[17644]: I0319 11:59:26.412285 17644 manager.go:217] Machine: {Timestamp:2026-03-19 11:59:26.411134803 +0000 UTC m=+0.181092858 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7514b5d6ada747ba9a1e5c7e73d4e6d3 SystemUUID:7514b5d6-ada7-47ba-9a1e-5c7e73d4e6d3 BootID:bab7eb38-7ae5-4f9e-8147-39f837056abe Filesystems:[{Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-592 DeviceMajor:0 DeviceMinor:592 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1035 DeviceMajor:0 DeviceMinor:1035 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7444b7503d7740b7e0cd43f84f6cce1196456b0e8df5c1dc67a1f73e2797cf61/userdata/shm DeviceMajor:0 DeviceMinor:240 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:1066 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~projected/kube-api-access-7nfnb DeviceMajor:0 DeviceMinor:268 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~projected/kube-api-access-bfvz6 DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~projected/kube-api-access-n7784 DeviceMajor:0 DeviceMinor:548 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:567 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:648 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-875 DeviceMajor:0 DeviceMinor:875 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-554 DeviceMajor:0 DeviceMinor:554 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/94301b494e7fa86c5ac2e6fa986da464195a196e9774c438e5a44b6eb0b525ae/userdata/shm DeviceMajor:0 DeviceMinor:820 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5/volumes/kubernetes.io~projected/kube-api-access-jdbjk DeviceMajor:0 DeviceMinor:349 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-448 DeviceMajor:0 DeviceMinor:448 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~projected/kube-api-access-pngsr DeviceMajor:0 DeviceMinor:94 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-366 DeviceMajor:0 DeviceMinor:366 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:436 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e78919d3ec5c9e1fc04085900a692953e2087a6d624466d667eb24bc45d8ddb6/userdata/shm DeviceMajor:0 DeviceMinor:97 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2c0b2d29cecf537e4921aa4396580e1259d6519819244de28dc54db9b3eeb9d0/userdata/shm DeviceMajor:0 DeviceMinor:656 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c238dcb10339e469e019f35f43263a486da7ad20431c7557165dd244d72db205/userdata/shm DeviceMajor:0 DeviceMinor:89 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-103 DeviceMajor:0 DeviceMinor:103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-541 DeviceMajor:0 DeviceMinor:541 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/034cad93-a500-4c58-8d97-fa49866a0d5e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:800 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1116 DeviceMajor:0 DeviceMinor:1116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8797022c969de9642db09ed804cdf4aed14c8648d4f8b5b9c9f88a55664979e8/userdata/shm DeviceMajor:0 DeviceMinor:280 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1154 DeviceMajor:0 DeviceMinor:1154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~projected/kube-api-access-s5rm4 DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-482 DeviceMajor:0 DeviceMinor:482 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-633 DeviceMajor:0 DeviceMinor:633 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/492d91fc21d30f80345040a63ee30545a1658028ca8d297dee64246b255c0fcb/userdata/shm DeviceMajor:0 DeviceMinor:813 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-165 DeviceMajor:0 DeviceMinor:165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-538 DeviceMajor:0 DeviceMinor:538 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-887 DeviceMajor:0 DeviceMinor:887 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/75aedbcd-f6ed-43a1-941b-4b04887ffe8e/volumes/kubernetes.io~projected/kube-api-access-dd6rv DeviceMajor:0 DeviceMinor:802 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1112 DeviceMajor:0 DeviceMinor:1112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6230ed8f-4608-4168-8f5a-656f411b0ef7/volumes/kubernetes.io~projected/kube-api-access-wzrh8 DeviceMajor:0 DeviceMinor:303 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-702 DeviceMajor:0 DeviceMinor:702 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:798 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/36816d6fc70a9260d540c9487629bb4d582fa5330a4c11074ee3f05c1e9cbe38/userdata/shm DeviceMajor:0 DeviceMinor:969 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~projected/kube-api-access-9dg9r DeviceMajor:0 DeviceMinor:1046 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-528 DeviceMajor:0 DeviceMinor:528 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert 
DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-857 DeviceMajor:0 DeviceMinor:857 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-998 DeviceMajor:0 DeviceMinor:998 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ac09dba7-398c-4b0a-a415-edb73cb4cf30/volumes/kubernetes.io~projected/kube-api-access-pbhv4 DeviceMajor:0 DeviceMinor:803 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1097 DeviceMajor:0 DeviceMinor:1097 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-691 DeviceMajor:0 DeviceMinor:691 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/937722eb1a7c864cdcd30f00f097350a31b04584988bb543632ba097925b3bbe/userdata/shm DeviceMajor:0 DeviceMinor:62 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d016f849de27f64d027bbd73120eb329b0253680086fad1c1a5d1d59daba5c27/userdata/shm DeviceMajor:0 DeviceMinor:111 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-378 DeviceMajor:0 DeviceMinor:378 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-513 DeviceMajor:0 DeviceMinor:513 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1089 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-881 DeviceMajor:0 DeviceMinor:881 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6e76fc3f-39a4-4f99-8603-38a94da6ea8e/volumes/kubernetes.io~projected/kube-api-access-5th4l DeviceMajor:0 DeviceMinor:375 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-704 DeviceMajor:0 DeviceMinor:704 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/00dd3703-af25-4e71-b20b-b3e153383489/volumes/kubernetes.io~projected/kube-api-access-k9ddk DeviceMajor:0 DeviceMinor:364 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e4a5278beb2dd7685cc80a2eb75df7f2fe99740c2893e28197254b1cb14f8f97/userdata/shm DeviceMajor:0 DeviceMinor:809 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-861 DeviceMajor:0 DeviceMinor:861 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:642 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-923 DeviceMajor:0 DeviceMinor:923 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-925 DeviceMajor:0 DeviceMinor:925 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp 
DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~projected/kube-api-access-2bb2x DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4e2c195f-e97d-4cac-81fc-2d5c551d1c30/volumes/kubernetes.io~projected/kube-api-access-kgz7q DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-654 DeviceMajor:0 DeviceMinor:654 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-996 DeviceMajor:0 DeviceMinor:996 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1012 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-342 DeviceMajor:0 DeviceMinor:342 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/24f71770-714e-4111-9188-ad8663c6baa7/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:837 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/adc1d294cb2c4faeba2726e706421944c88f312613f6f9484ed976e0c65190f9/userdata/shm DeviceMajor:0 DeviceMinor:994 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~projected/kube-api-access-p5fnx DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-71 DeviceMajor:0 DeviceMinor:71 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/b54d0875a5c74a95cdb12684066d437927027dd749aa30fdd27e9a88de808b47/userdata/shm DeviceMajor:0 DeviceMinor:236 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:140 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-720 DeviceMajor:0 DeviceMinor:720 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-471 DeviceMajor:0 DeviceMinor:471 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-883 DeviceMajor:0 DeviceMinor:883 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~projected/kube-api-access-nlr9q DeviceMajor:0 DeviceMinor:270 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-362 DeviceMajor:0 DeviceMinor:362 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/45a1f521d794b1ca367dab762c38f2fc2e98e9ba7d75ffaddb7fceef49fff20d/userdata/shm DeviceMajor:0 DeviceMinor:464 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:473 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7a51eeaf-1349-4bf3-932d-22ed5ce7c161/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:761 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b5c7eb66-e23e-40df-883c-fed012c07f26/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:789 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1082 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/76cf2b01-33d9-47eb-be5d-44946c78bf20/volumes/kubernetes.io~projected/kube-api-access-nj527 DeviceMajor:0 DeviceMinor:697 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:1010 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1069 DeviceMajor:0 DeviceMinor:1069 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1165 DeviceMajor:0 DeviceMinor:1165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-644 DeviceMajor:0 DeviceMinor:644 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-368 DeviceMajor:0 DeviceMinor:368 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/kube-api-access-m7tc5 DeviceMajor:0 DeviceMinor:269 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1087 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-587 DeviceMajor:0 DeviceMinor:587 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1051 DeviceMajor:0 DeviceMinor:1051 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-77 DeviceMajor:0 
DeviceMinor:77 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9778f8f5-b0d1-4967-9776-9db758bba3af/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1006 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6611e325-6152-480c-9c2c-1b503e49ccd2/volumes/kubernetes.io~projected/kube-api-access-4p4hg DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c/volumes/kubernetes.io~projected/kube-api-access-2mxjl DeviceMajor:0 DeviceMinor:577 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:1150 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/12f1d4709c9e0d0ad1a233908194f29f84992ca4b99bba4692dddc9f4338c1ef/userdata/shm DeviceMajor:0 DeviceMinor:443 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b3de8a1b-a5be-414f-86e8-738e16c8bc97/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:601 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-345 DeviceMajor:0 DeviceMinor:345 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f7a356015607c77d353df6671f85d12adf9e42d7853bd37134503d15b666f482/userdata/shm DeviceMajor:0 DeviceMinor:583 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fadafce6827750abea7ff3c06bd1da6d8ccd788c149ff361b207b48ae0bcefb8/userdata/shm DeviceMajor:0 DeviceMinor:790 Capacity:67108864 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-937 DeviceMajor:0 DeviceMinor:937 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3c3b0d24-ce5e-49c3-a546-874356f75dc6/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/kube-api-access-46m89 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~projected/kube-api-access-zrgqb DeviceMajor:0 DeviceMinor:255 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:432 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aaaaf539-bf61-44d7-8d47-97535b7aa1ba/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:437 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1000 DeviceMajor:0 DeviceMinor:1000 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e8aeb063908bf0937ac94f698cd72366b310f38d0a1756120e33b67a92cd55de/userdata/shm DeviceMajor:0 DeviceMinor:533 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b358fce6bb46e5b5037cb28d6e8fc423fe1541e849c427617b2d5f7f7a209743/userdata/shm DeviceMajor:0 DeviceMinor:818 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/26bcc676a684bba59ce239a7b0c6d837715bffea1d6d9d661570c6d71c3af31c/userdata/shm DeviceMajor:0 DeviceMinor:354 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~projected/kube-api-access-f2hrw DeviceMajor:0 DeviceMinor:467 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-885 DeviceMajor:0 DeviceMinor:885 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85/volumes/kubernetes.io~projected/kube-api-access-n5skx DeviceMajor:0 DeviceMinor:696 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b5c7eb66-e23e-40df-883c-fed012c07f26/volumes/kubernetes.io~projected/kube-api-access-tx487 DeviceMajor:0 DeviceMinor:796 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a00871e023b42142484ad987a4c2956151fb53dc58e2ab128b59501bf258f39e/userdata/shm DeviceMajor:0 DeviceMinor:1065 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a387a6fb603981d31a2529e0731ac72c41f84be90202777248f07296f1eb9d6b/userdata/shm DeviceMajor:0 DeviceMinor:99 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-370 DeviceMajor:0 DeviceMinor:370 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-552 DeviceMajor:0 DeviceMinor:552 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~projected/kube-api-access-zndqq DeviceMajor:0 DeviceMinor:1060 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-176 DeviceMajor:0 DeviceMinor:176 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-469 DeviceMajor:0 DeviceMinor:469 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-621 DeviceMajor:0 DeviceMinor:621 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/1b94d1eb-1b80-4a14-b1c0-d9e192231352/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:462 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-306 DeviceMajor:0 DeviceMinor:306 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1038 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-409 DeviceMajor:0 DeviceMinor:409 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-722 DeviceMajor:0 DeviceMinor:722 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-476 DeviceMajor:0 DeviceMinor:476 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/732989c5-1b89-46f0-9917-b68613f7f005/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/dd6ec279-d92f-45c2-97c2-88b96fbd6600/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:435 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-361 DeviceMajor:0 DeviceMinor:361 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1011 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-738 DeviceMajor:0 DeviceMinor:738 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1023 DeviceMajor:0 DeviceMinor:1023 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-605 DeviceMajor:0 DeviceMinor:605 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-611 DeviceMajor:0 DeviceMinor:611 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8438d015-106b-4aed-ae12-dda781ce51fc/volumes/kubernetes.io~projected/kube-api-access-cqr6w DeviceMajor:0 DeviceMinor:148 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/22e10648-af7c-409e-b947-570e7d807e05/volumes/kubernetes.io~projected/kube-api-access-wls49 DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8e1fd7c8f094ce1e4302e058e811af6aae2e3addf7bd81aa94568f27af29f0c9/userdata/shm DeviceMajor:0 DeviceMinor:478 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-353 DeviceMajor:0 DeviceMinor:353 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1156 DeviceMajor:0 DeviceMinor:1156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:431 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-494 DeviceMajor:0 DeviceMinor:494 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-811 DeviceMajor:0 DeviceMinor:811 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d26de8d7725dab288840f8eb4631a12a8821676d8fd47b0810577c9ee4f7e3b9/userdata/shm DeviceMajor:0 DeviceMinor:1152 Capacity:67108864 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~projected/kube-api-access-88ghj DeviceMajor:0 DeviceMinor:472 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/01bd4a2802323b3faf679fc3ea0fe20efacc45eab046badf1be6c2b07116febc/userdata/shm DeviceMajor:0 DeviceMinor:816 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-836 DeviceMajor:0 DeviceMinor:836 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-910 DeviceMajor:0 DeviceMinor:910 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:591 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4/volumes/kubernetes.io~projected/kube-api-access-894bt DeviceMajor:0 DeviceMinor:337 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e2ad29ad-70ef-43c6-91f6-02f04d145673/volumes/kubernetes.io~projected/kube-api-access-trcb7 DeviceMajor:0 DeviceMinor:1013 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1076 DeviceMajor:0 DeviceMinor:1076 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1029 DeviceMajor:0 DeviceMinor:1029 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/daf4dbb6-5a0a-4c92-a930-479a7330ace1/volumes/kubernetes.io~projected/kube-api-access-72jlb DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/88f5dfffda4adf62f6636e4646d2c851172ef321255a628934b6453ee67c8f03/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5359d955256489cf75babf6cd7e374f24ca5753414f295ec115bac354fbe37e1/userdata/shm DeviceMajor:0 DeviceMinor:249 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~projected/kube-api-access-dnl28 DeviceMajor:0 DeviceMinor:252 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-585 DeviceMajor:0 DeviceMinor:585 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~projected/kube-api-access-dvkxx DeviceMajor:0 DeviceMinor:649 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-778 DeviceMajor:0 DeviceMinor:778 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6956b1980f4b04ce367cfbad3aeea7396b54e1517e031f7afdbbd760960fd241/userdata/shm DeviceMajor:0 DeviceMinor:1113 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bca8253525b3cd943116e55714fdf37c6331867834b278964c5e6f5dd4c53fef/userdata/shm DeviceMajor:0 DeviceMinor:441 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/e8bcebf454a79198b14303cf41946d1cf832021a30a2591e1b23c6740fca1e9b/userdata/shm DeviceMajor:0 DeviceMinor:608 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-727 DeviceMajor:0 DeviceMinor:727 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-859 DeviceMajor:0 DeviceMinor:859 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-931 DeviceMajor:0 DeviceMinor:931 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a2b791c04ceadd3a171b7dda7655ef7534b61d799b6ce663909c8e48a8e61525/userdata/shm DeviceMajor:0 DeviceMinor:1017 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1025 DeviceMajor:0 DeviceMinor:1025 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/857137fd3aca8af8c5c19bcaeff329a322e9a54b7ff7f19d360c176d0e68cab5/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/975e632bf87b61a6785fc741d9417b8abbd6243ba2abd8088f9fe581fcfef90c/userdata/shm DeviceMajor:0 DeviceMinor:624 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-852 DeviceMajor:0 DeviceMinor:852 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6870ccc7-2094-48d8-9238-f486a4b8d5af/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1037 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/52bdf7cc-f07d-487e-937c-6567f194947e/volumes/kubernetes.io~projected/kube-api-access-8dbmq DeviceMajor:0 DeviceMinor:793 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2d63d5a8-f45d-4678-824d-5534b2bcd6ca/volumes/kubernetes.io~projected/kube-api-access-kwrd5 
DeviceMajor:0 DeviceMinor:1091 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:547 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/dd6ec279-d92f-45c2-97c2-88b96fbd6600/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:98 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-686 DeviceMajor:0 DeviceMinor:686 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bb22a965-9b36-40cd-993d-747a3978be8e/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:785 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/bb22a965-9b36-40cd-993d-747a3978be8e/volumes/kubernetes.io~projected/kube-api-access-5p55f DeviceMajor:0 DeviceMinor:786 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:433 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8f6241ff25db322ca912b366aec02ce24e776e994e5454c053b2a00c5bd1a93b/userdata/shm DeviceMajor:0 DeviceMinor:438 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-304 DeviceMajor:0 DeviceMinor:304 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7a51eeaf-1349-4bf3-932d-22ed5ce7c161/volumes/kubernetes.io~projected/kube-api-access-cfxw7 DeviceMajor:0 DeviceMinor:770 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~projected/kube-api-access-lkm97 DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a1c364bd3d663a56cc2f90bf6e8ea8c50127add36b90978697972f8218a89ed7/userdata/shm DeviceMajor:0 DeviceMinor:395 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-558 DeviceMajor:0 DeviceMinor:558 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-572 DeviceMajor:0 DeviceMinor:572 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1088 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-312 DeviceMajor:0 DeviceMinor:312 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1b94d1eb-1b80-4a14-b1c0-d9e192231352/volumes/kubernetes.io~projected/kube-api-access-srlcl DeviceMajor:0 DeviceMinor:463 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-347 DeviceMajor:0 DeviceMinor:347 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab/volumes/kubernetes.io~projected/kube-api-access-8v9bx DeviceMajor:0 DeviceMinor:804 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1121 DeviceMajor:0 DeviceMinor:1121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~projected/kube-api-access 
DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/39d3ac31-9259-454b-8e1c-e23024f8f2b2/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:259 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:788 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f4aad0ff-e6cd-4c43-9561-80a14fee4712/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1059 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-319 DeviceMajor:0 DeviceMinor:319 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/034cad93-a500-4c58-8d97-fa49866a0d5e/volumes/kubernetes.io~projected/kube-api-access-jptl6 DeviceMajor:0 DeviceMinor:801 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/604f16a2ad7d04e1bbd75b7eca1988232760bcd65e1311be08e2c7a3cbb4c10a/userdata/shm DeviceMajor:0 DeviceMinor:817 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-693 DeviceMajor:0 DeviceMinor:693 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4b35ea3ef7523bac2219f1d11ac9a4ce57129adbac9b8a1915c2a12e2d7a7c68/userdata/shm DeviceMajor:0 DeviceMinor:308 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-512 DeviceMajor:0 DeviceMinor:512 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/17a53907f75f6dae7caa627268daa345a6154ff885830dae9a1873ed761e0552/userdata/shm DeviceMajor:0 DeviceMinor:1095 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-501 DeviceMajor:0 DeviceMinor:501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-863 DeviceMajor:0 DeviceMinor:863 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~projected/kube-api-access-m8bmw DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-310 DeviceMajor:0 DeviceMinor:310 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-484 DeviceMajor:0 DeviceMinor:484 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/eab7f63dc5326173ea1e6327285462aa6a81c9b141ac54e3d2487017aec7ef32/userdata/shm DeviceMajor:0 DeviceMinor:698 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c8d8a09f-22d5-4f16-84d6-d5f2c504c949/volumes/kubernetes.io~projected/kube-api-access-p5jsb DeviceMajor:0 DeviceMinor:965 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-873 DeviceMajor:0 DeviceMinor:873 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d7889469ef63ab146c50d169dc4f57ff3c6e05bfe52d83c88832208089809932/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/25dfae9bb0843173d90c844dacf16818cb3d6d61cb972bb6cd1177b47a320778/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-603 DeviceMajor:0 DeviceMinor:603 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/52bdf7cc-f07d-487e-937c-6567f194947e/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:792 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/75aedbcd-f6ed-43a1-941b-4b04887ffe8e/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:799 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-954 DeviceMajor:0 DeviceMinor:954 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-450 DeviceMajor:0 DeviceMinor:450 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/76cf2b01-33d9-47eb-be5d-44946c78bf20/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:690 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~projected/kube-api-access-c7nhq DeviceMajor:0 DeviceMinor:795 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-707 DeviceMajor:0 DeviceMinor:707 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d2ecc9d2456937963c5f6bc8147a2bbe973205b8a12f3d89082b348a330ba2e2/userdata/shm DeviceMajor:0 DeviceMinor:1047 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1049 DeviceMajor:0 DeviceMinor:1049 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d/volumes/kubernetes.io~projected/kube-api-access-qql5t DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:460 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/caa4e9bd96e874f51a79da89bbb64da72933b4ef3464772d351cf399d375866a/userdata/shm DeviceMajor:0 DeviceMinor:477 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-713 DeviceMajor:0 DeviceMinor:713 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d625c81e-01cc-424a-997d-546a5204a72b/volumes/kubernetes.io~projected/kube-api-access-tgzdh DeviceMajor:0 DeviceMinor:318 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e48b5aa9-293e-4222-91ff-7640addeca4c/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:430 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/832f700980eab592f836b89a6aebe98be99148aa95ac29165addb6fccc6389c3/userdata/shm DeviceMajor:0 DeviceMinor:650 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2292109e-92a9-4286-858e-dcd2ac083c43/volumes/kubernetes.io~projected/kube-api-access-8rt57 DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/82ce23dbad1fafac03170cf8dbdc37b1358bba5d494b0305bc59731ec33ac062/userdata/shm DeviceMajor:0 DeviceMinor:614 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-314 DeviceMajor:0 DeviceMinor:314 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ee84c91e209b8d15a57102d97b9b923b7a0a0247657f697f48616f38ce178b0b/userdata/shm DeviceMajor:0 DeviceMinor:221 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-67 DeviceMajor:0 DeviceMinor:67 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103/volumes/kubernetes.io~projected/kube-api-access-hg6sp DeviceMajor:0 DeviceMinor:359 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-678 DeviceMajor:0 DeviceMinor:678 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:695 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-734 DeviceMajor:0 DeviceMinor:734 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-658 DeviceMajor:0 DeviceMinor:658 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ebe898baef0ea3cf7f17e803722db21b9281248ac2ac1d6fe40d8e59580a9cee/userdata/shm DeviceMajor:0 DeviceMinor:646 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/66f88242-8b0b-4790-bbb6-445c19b34ee7/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-257 DeviceMajor:0 DeviceMinor:257 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-333 DeviceMajor:0 
DeviceMinor:333 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1c898657-f06b-44ab-95ff-53a324759ba1/volumes/kubernetes.io~projected/kube-api-access-mt6bf DeviceMajor:0 DeviceMinor:578 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/volumes/kubernetes.io~projected/kube-api-access-nds54 DeviceMajor:0 DeviceMinor:251 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-555 DeviceMajor:0 DeviceMinor:555 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/12809811-c9df-4e77-8c12-309831b8975d/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:989 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1123 DeviceMajor:0 DeviceMinor:1123 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6e76fc3f-39a4-4f99-8603-38a94da6ea8e/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:374 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-109 DeviceMajor:0 DeviceMinor:109 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-530 DeviceMajor:0 DeviceMinor:530 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-743 DeviceMajor:0 DeviceMinor:743 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-95 DeviceMajor:0 DeviceMinor:95 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-403 DeviceMajor:0 DeviceMinor:403 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/4f25a976585d22d9ce3955473a200e96837f45c766e321488b3d87050f023b7a/userdata/shm DeviceMajor:0 DeviceMinor:439 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-681 DeviceMajor:0 DeviceMinor:681 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1135 DeviceMajor:0 DeviceMinor:1135 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f29b11ce-60e0-46b3-8d28-eea3452513cd/volumes/kubernetes.io~projected/kube-api-access-bgs4l DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9a2616ea0257b4942755b9e9fb23bb4dfd3518f40e9ffe96a9ef4230caaa00fe/userdata/shm DeviceMajor:0 DeviceMinor:557 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/24f71770-714e-4111-9188-ad8663c6baa7/volumes/kubernetes.io~projected/kube-api-access-m287x DeviceMajor:0 DeviceMinor:842 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:780 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e6908445d4f9d29994371a77f0165de1617d0b3d69f7e33acfc73003f26e2111/userdata/shm DeviceMajor:0 DeviceMinor:657 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c0337cf9dcdc7cc749cac3adad0f44d0d5457a466ca84750f37317d1eb4a70f1/userdata/shm DeviceMajor:0 DeviceMinor:699 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-65 DeviceMajor:0 DeviceMinor:65 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/volumes/kubernetes.io~projected/kube-api-access-lqcvx DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~projected/kube-api-access-79qrb DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-316 DeviceMajor:0 DeviceMinor:316 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1108 DeviceMajor:0 DeviceMinor:1108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e65e2a2f-16b5-44a3-9860-741f70188ab5/volumes/kubernetes.io~projected/kube-api-access-4fvvj DeviceMajor:0 DeviceMinor:1014 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e01c0d4f6330ee155cedce051137a3842f3cbc1b8b4039503e3a3e9fd950bf49/userdata/shm DeviceMajor:0 DeviceMinor:1019 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-977 DeviceMajor:0 DeviceMinor:977 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-955 DeviceMajor:0 DeviceMinor:955 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~projected/kube-api-access-lscpq DeviceMajor:0 DeviceMinor:1092 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/38cb26629a14fdae9d7f35eac30d1706193c11f4823405b1ab890376e3178bdd/userdata/shm DeviceMajor:0 DeviceMinor:684 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-101 DeviceMajor:0 DeviceMinor:101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-458 DeviceMajor:0 DeviceMinor:458 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-58 DeviceMajor:0 DeviceMinor:58 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6502c99aaf4d4f945a08ddd70ddf47028a9961291a598bc4054d9498e0e3049e/userdata/shm DeviceMajor:0 DeviceMinor:444 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-564 DeviceMajor:0 DeviceMinor:564 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1027 DeviceMajor:0 DeviceMinor:1027 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d6045bc934b39d2e74e105cd2ee97a2d4e1d69429d08a4cbb80aeb107f492bc3/userdata/shm DeviceMajor:0 DeviceMinor:580 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cf08ab4f-c203-4c16-9826-8cc049f4af31/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:599 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-668 DeviceMajor:0 DeviceMinor:668 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-877 DeviceMajor:0 DeviceMinor:877 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-847 DeviceMajor:0 DeviceMinor:847 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2151069adcf5d6126fb57190dc2ec941b6dc342421174da0283b995f56e1641b/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/14576382107dc09a133f25dfe11c859b57d691f83816910915dfdbd5db8c6773/userdata/shm DeviceMajor:0 DeviceMinor:622 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1137 DeviceMajor:0 DeviceMinor:1137 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/18f4fa41b32bbdc0315d2c159f68c3407a8234dacc09fe18dea04525d0e88d8c/userdata/shm DeviceMajor:0 DeviceMinor:808 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-855 DeviceMajor:0 DeviceMinor:855 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/738ee83a26962b58779f847316e3a8a5be1d6bd92f0c0f29c25cdbb8703c5c59/userdata/shm DeviceMajor:0 DeviceMinor:771 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~projected/kube-api-access-k7bq7 DeviceMajor:0 DeviceMinor:807 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-264 DeviceMajor:0 DeviceMinor:264 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e45616db-f7dd-4a08-847f-abf2759d9fa4/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:647 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/862dbe8b648f15ee3ab2e74272152e657f518e1985ef0d38baf17c28a33a4abb/userdata/shm DeviceMajor:0 DeviceMinor:607 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1099 DeviceMajor:0 DeviceMinor:1099 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/09a22c25-6073-4b1a-a029-928452ef37db/volumes/kubernetes.io~projected/kube-api-access-xx4wk DeviceMajor:0 
DeviceMinor:105 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/159515e88e5e657c5dc1a45dfc38f542a76bac41085e0be14941a32b19e214ef/userdata/shm DeviceMajor:0 DeviceMinor:276 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3053504d-0734-4def-b639-0f5cc2178185/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:502 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/27419838ac6bb228f6151c74e466e550ee30c7ce1c14772f63c150dcd524d6e7/userdata/shm DeviceMajor:0 DeviceMinor:279 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-452 DeviceMajor:0 DeviceMinor:452 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-821 DeviceMajor:0 DeviceMinor:821 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-833 DeviceMajor:0 DeviceMinor:833 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c8d8a09f-22d5-4f16-84d6-d5f2c504c949/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:934 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/d06b230b-db67-4afc-8d10-2c33ad568462/volumes/kubernetes.io~projected/kube-api-access-4bbtl DeviceMajor:0 DeviceMinor:1090 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-524 DeviceMajor:0 DeviceMinor:524 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-879 DeviceMajor:0 DeviceMinor:879 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1074 DeviceMajor:0 DeviceMinor:1074 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-247 DeviceMajor:0 DeviceMinor:247 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/163d6a3d-0080-4122-bb7a-17f6e63f00f0/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:434 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa3b1b6a2b92eddacf5dacab6a0147cb12a4e498e9d143158aa50de12bd5c3b7/userdata/shm DeviceMajor:0 DeviceMinor:509 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-266 DeviceMajor:0 DeviceMinor:266 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/67cce31157aba8cd32c19fdb97a814cf6764c07048a060294135b6ce20e85f0e/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-411 DeviceMajor:0 DeviceMinor:411 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9b61ea14-a7ea-49f3-9df4-5655765ddf7c/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-492 DeviceMajor:0 DeviceMinor:492 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 
DeviceMinor:1149 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-909 DeviceMajor:0 DeviceMinor:909 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-984 DeviceMajor:0 DeviceMinor:984 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-828 DeviceMajor:0 DeviceMinor:828 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-844 DeviceMajor:0 DeviceMinor:844 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d1eef757-d63a-4708-8efe-7b27eea1ff63/volumes/kubernetes.io~projected/kube-api-access-kbq7n DeviceMajor:0 DeviceMinor:79 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/92e401a4-ed2f-46f7-924b-329d7b313e6a/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:791 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8070d874c8e6aab4717f63db58f142956c8f18d4e16e21f12ce84898692af2f8/userdata/shm DeviceMajor:0 DeviceMinor:812 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8376e1f9-ab05-42d4-aa66-284a167a9bfc/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:546 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-596 DeviceMajor:0 DeviceMinor:596 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-708 DeviceMajor:0 DeviceMinor:708 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/dedf55c4-eeda-4955-aafe-db1fdb8c4a48/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1086 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8ce4d1f32bb0cc2bd6719ecbc1bb660798af73ec1a021eb215e32bb686d9ba1b/userdata/shm DeviceMajor:0 DeviceMinor:1093 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1119 DeviceMajor:0 DeviceMinor:1119 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/402476363b5df4845bdf76440169d41c48c7c304f89463a3160ab10c4b9c45da/userdata/shm DeviceMajor:0 DeviceMinor:331 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-615 DeviceMajor:0 DeviceMinor:615 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e5078f17-bc65-460f-9f18-8c506db6840b/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:600 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1101 DeviceMajor:0 DeviceMinor:1101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/74d7d2df3602ec247c94c7641e1ca1523b5ae6b42624ca797fbd2b6225dfbfa4/userdata/shm DeviceMajor:0 DeviceMinor:260 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc/volumes/kubernetes.io~projected/kube-api-access-dr788 DeviceMajor:0 DeviceMinor:787 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1072 DeviceMajor:0 DeviceMinor:1072 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-935 DeviceMajor:0 DeviceMinor:935 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/28a32f59656edf5ebf4428eb19343f79c79bdc3e9a5ed63a5fa7185ccacbd30e/userdata/shm DeviceMajor:0 DeviceMinor:282 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-579 DeviceMajor:0 DeviceMinor:579 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-652 DeviceMajor:0 DeviceMinor:652 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/716c2176-50f9-4c4f-af0e-4c7973457df2/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:602 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:805 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-979 DeviceMajor:0 DeviceMinor:979 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/23f47642dd95b352c86bf3516967ac9ae86ccfd441d6afb36a3e2d4a5c622a4a/userdata/shm DeviceMajor:0 DeviceMinor:144 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-356 DeviceMajor:0 DeviceMinor:356 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/12809811-c9df-4e77-8c12-309831b8975d/volumes/kubernetes.io~projected/kube-api-access-bdx6s DeviceMajor:0 DeviceMinor:993 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ac09dba7-398c-4b0a-a415-edb73cb4cf30/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:797 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a3ceeece-bee9-4fcb-8517-95ebce38e223/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-456 DeviceMajor:0 DeviceMinor:456 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:590 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9bf0e017da39714ca0d58a2ba0c46cd89a43ae7f317d13dbb6e31831feeb576/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-405 DeviceMajor:0 DeviceMinor:405 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~projected/kube-api-access-g997b DeviceMajor:0 DeviceMinor:1151 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/dbcbba74-ac53-4724-a217-4d9b85e7c1db/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-534 DeviceMajor:0 DeviceMinor:534 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1021 DeviceMajor:0 DeviceMinor:1021 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:1145 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/376b18a9-5f33-44fd-a37b-20ab02c5e65d/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:466 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-416 DeviceMajor:0 DeviceMinor:416 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-975 DeviceMajor:0 DeviceMinor:975 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-830 DeviceMajor:0 DeviceMinor:830 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5a3a840584953aa05811a73f7731c28fab3047c34d3f28cfbf2a20aad97cf6c3/userdata/shm DeviceMajor:0 DeviceMinor:1015 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes/kubernetes.io~projected/kube-api-access-7spvn DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-967 DeviceMajor:0 DeviceMinor:967 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f5d73fef-1414-4b29-97ea-42e1c0b1ef18/volumes/kubernetes.io~projected/kube-api-access-v27lg DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-454 DeviceMajor:0 DeviceMinor:454 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6d41245b-33d4-40f8-bbe1-6d2247e2e335/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:806 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/683aa0635e184216531580a563438a1b652c9e9d46d69283fd6cdf0548cf223d/userdata/shm DeviceMajor:0 DeviceMinor:814 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 
Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:01bd4a2802323b3 MacAddress:52:b8:2d:a9:8c:45 Speed:10000 Mtu:8900} {Name:14576382107dc09 MacAddress:96:22:03:64:08:d9 Speed:10000 Mtu:8900} {Name:159515e88e5e657 MacAddress:ea:1e:ac:bd:92:e0 Speed:10000 Mtu:8900} {Name:17a53907f75f6da MacAddress:fe:c2:54:14:a7:bc Speed:10000 Mtu:8900} {Name:18f4fa41b32bbdc MacAddress:12:32:3a:92:61:79 Speed:10000 Mtu:8900} {Name:26bcc676a684bba MacAddress:ca:d5:63:08:87:29 Speed:10000 Mtu:8900} {Name:27419838ac6bb22 MacAddress:d2:e6:d1:53:c7:dd Speed:10000 Mtu:8900} {Name:28a32f59656edf5 MacAddress:ea:22:69:45:52:8e Speed:10000 Mtu:8900} {Name:2c0b2d29cecf537 MacAddress:36:28:be:34:40:0b Speed:10000 Mtu:8900} {Name:402476363b5df48 MacAddress:fe:f5:31:ca:f5:4e Speed:10000 Mtu:8900} {Name:45a1f521d794b1c MacAddress:62:29:66:b2:57:c6 Speed:10000 Mtu:8900} {Name:492d91fc21d30f8 MacAddress:2a:48:2a:a2:40:5b Speed:10000 Mtu:8900} {Name:4b35ea3ef7523ba MacAddress:a6:9e:7c:88:ae:3f Speed:10000 Mtu:8900} {Name:4f25a976585d22d MacAddress:0a:da:03:dc:33:1b Speed:10000 Mtu:8900} {Name:5359d955256489c MacAddress:1a:a2:18:65:79:1c Speed:10000 Mtu:8900} {Name:5a3a840584953aa MacAddress:8a:6f:4d:e7:66:2b Speed:10000 Mtu:8900} {Name:604f16a2ad7d04e MacAddress:8e:be:9c:5a:50:40 Speed:10000 Mtu:8900} {Name:6502c99aaf4d4f9 MacAddress:76:17:9b:41:87:e3 Speed:10000 Mtu:8900} {Name:67cce31157aba8c MacAddress:a2:bc:58:e8:51:35 Speed:10000 Mtu:8900} {Name:683aa0635e18421 MacAddress:4a:08:87:13:a2:3b Speed:10000 Mtu:8900} {Name:738ee83a26962b5 MacAddress:da:18:a8:77:4a:87 Speed:10000 Mtu:8900} {Name:7444b7503d7740b MacAddress:46:4f:91:cb:b3:6d Speed:10000 Mtu:8900} {Name:74d7d2df3602ec2 MacAddress:c2:68:a6:39:da:fb Speed:10000 Mtu:8900} {Name:8070d874c8e6aab MacAddress:a6:44:93:03:87:fb Speed:10000 
Mtu:8900} {Name:82ce23dbad1fafa MacAddress:5e:43:35:50:57:9a Speed:10000 Mtu:8900} {Name:832f700980eab59 MacAddress:42:cc:86:e2:b2:7d Speed:10000 Mtu:8900} {Name:857137fd3aca8af MacAddress:3a:2a:7d:8a:54:d8 Speed:10000 Mtu:8900} {Name:862dbe8b648f15e MacAddress:02:05:c1:8d:cc:b2 Speed:10000 Mtu:8900} {Name:8797022c969de96 MacAddress:e6:77:37:0f:25:94 Speed:10000 Mtu:8900} {Name:8ce4d1f32bb0cc2 MacAddress:4a:09:0d:bc:a3:a7 Speed:10000 Mtu:8900} {Name:8e1fd7c8f094ce1 MacAddress:2e:76:53:9d:06:63 Speed:10000 Mtu:8900} {Name:8f6241ff25db322 MacAddress:2e:3f:94:e7:c8:77 Speed:10000 Mtu:8900} {Name:937722eb1a7c864 MacAddress:96:c1:2f:7b:a0:dc Speed:10000 Mtu:8900} {Name:94301b494e7fa86 MacAddress:4a:d3:1e:7f:1b:b1 Speed:10000 Mtu:8900} {Name:975e632bf87b61a MacAddress:a6:e3:19:f4:34:b7 Speed:10000 Mtu:8900} {Name:a00871e023b4214 MacAddress:5e:a0:3f:77:8f:ac Speed:10000 Mtu:8900} {Name:a1c364bd3d663a5 MacAddress:ba:99:c3:7f:56:e7 Speed:10000 Mtu:8900} {Name:a2b791c04ceadd3 MacAddress:de:69:59:27:0d:46 Speed:10000 Mtu:8900} {Name:aa3b1b6a2b92edd MacAddress:c6:58:c0:5d:1b:76 Speed:10000 Mtu:8900} {Name:adc1d294cb2c4fa MacAddress:3e:78:6c:7d:bb:51 Speed:10000 Mtu:8900} {Name:b358fce6bb46e5b MacAddress:96:9b:c5:19:1a:99 Speed:10000 Mtu:8900} {Name:b54d0875a5c74a9 MacAddress:aa:0e:5c:f8:98:4c Speed:10000 Mtu:8900} {Name:bca8253525b3cd9 MacAddress:76:88:ed:5d:9d:4a Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:a2:b3:fa:06:64:87 Speed:0 Mtu:8900} {Name:c0337cf9dcdc7cc MacAddress:d6:4e:31:25:33:e0 Speed:10000 Mtu:8900} {Name:caa4e9bd96e874f MacAddress:82:bf:49:c7:31:5b Speed:10000 Mtu:8900} {Name:d26de8d7725dab2 MacAddress:f2:42:54:ad:2b:a1 Speed:10000 Mtu:8900} {Name:d6045bc934b39d2 MacAddress:ce:70:bc:e2:12:0e Speed:10000 Mtu:8900} {Name:d9bf0e017da3971 MacAddress:9e:4b:91:5e:1d:32 Speed:10000 Mtu:8900} {Name:e4a5278beb2dd76 MacAddress:5e:3d:28:9d:bf:63 Speed:10000 Mtu:8900} {Name:e6908445d4f9d29 
MacAddress:46:4e:f0:39:6a:a2 Speed:10000 Mtu:8900} {Name:e8aeb063908bf09 MacAddress:c2:59:3a:3f:5c:46 Speed:10000 Mtu:8900} {Name:e8bcebf454a7919 MacAddress:ae:ba:08:fd:29:a0 Speed:10000 Mtu:8900} {Name:eab7f63dc532617 MacAddress:4e:a4:0f:88:43:9a Speed:10000 Mtu:8900} {Name:ebe898baef0ea3c MacAddress:d6:94:59:8f:ac:8c Speed:10000 Mtu:8900} {Name:ee84c91e209b8d1 MacAddress:9e:0f:da:81:b7:77 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:3b:cf:f0 Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:0e:7e:5d:df:5c:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 11:59:26.414476 master-0 kubenswrapper[17644]: I0319 11:59:26.413992 17644 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 19 11:59:26.414476 master-0 kubenswrapper[17644]: I0319 11:59:26.414072 17644 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 11:59:26.414476 master-0 kubenswrapper[17644]: I0319 11:59:26.414336 17644 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 11:59:26.414858 master-0 kubenswrapper[17644]: I0319 11:59:26.414519 17644 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 11:59:26.414858 master-0 kubenswrapper[17644]: I0319 11:59:26.414558 17644 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi"
,"Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 11:59:26.414858 master-0 kubenswrapper[17644]: I0319 11:59:26.414824 17644 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 11:59:26.414858 master-0 kubenswrapper[17644]: I0319 11:59:26.414838 17644 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 11:59:26.414858 master-0 kubenswrapper[17644]: I0319 11:59:26.414847 17644 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 11:59:26.415000 master-0 kubenswrapper[17644]: I0319 11:59:26.414873 17644 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 11:59:26.415000 master-0 kubenswrapper[17644]: I0319 11:59:26.414914 17644 state_mem.go:36] "Initialized new in-memory state store" Mar 19 11:59:26.415058 master-0 kubenswrapper[17644]: I0319 11:59:26.415016 17644 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 11:59:26.415106 master-0 kubenswrapper[17644]: I0319 11:59:26.415080 17644 kubelet.go:418] "Attempting to sync node with API server" Mar 19 11:59:26.415106 master-0 kubenswrapper[17644]: I0319 11:59:26.415100 17644 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 11:59:26.415176 master-0 kubenswrapper[17644]: I0319 11:59:26.415119 17644 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 11:59:26.415176 master-0 kubenswrapper[17644]: I0319 11:59:26.415134 17644 kubelet.go:324] "Adding apiserver pod source" Mar 
19 11:59:26.415176 master-0 kubenswrapper[17644]: I0319 11:59:26.415154 17644 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 11:59:26.418491 master-0 kubenswrapper[17644]: I0319 11:59:26.418281 17644 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 19 11:59:26.419643 master-0 kubenswrapper[17644]: I0319 11:59:26.418966 17644 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 19 11:59:26.419643 master-0 kubenswrapper[17644]: I0319 11:59:26.419385 17644 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 11:59:26.419966 master-0 kubenswrapper[17644]: I0319 11:59:26.419883 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 11:59:26.419966 master-0 kubenswrapper[17644]: I0319 11:59:26.419912 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 11:59:26.419966 master-0 kubenswrapper[17644]: I0319 11:59:26.419921 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 11:59:26.419966 master-0 kubenswrapper[17644]: I0319 11:59:26.419939 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 11:59:26.419966 master-0 kubenswrapper[17644]: I0319 11:59:26.419948 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 11:59:26.419966 master-0 kubenswrapper[17644]: I0319 11:59:26.419956 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 11:59:26.419966 master-0 kubenswrapper[17644]: I0319 11:59:26.419965 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 11:59:26.419966 master-0 kubenswrapper[17644]: I0319 11:59:26.419972 17644 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Mar 19 11:59:26.420554 master-0 kubenswrapper[17644]: I0319 11:59:26.419986 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 11:59:26.420554 master-0 kubenswrapper[17644]: I0319 11:59:26.419996 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 11:59:26.420554 master-0 kubenswrapper[17644]: I0319 11:59:26.420029 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 11:59:26.420554 master-0 kubenswrapper[17644]: I0319 11:59:26.420053 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 11:59:26.420554 master-0 kubenswrapper[17644]: I0319 11:59:26.420094 17644 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 11:59:26.421182 master-0 kubenswrapper[17644]: I0319 11:59:26.420850 17644 server.go:1280] "Started kubelet" Mar 19 11:59:26.422503 master-0 kubenswrapper[17644]: I0319 11:59:26.422243 17644 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 11:59:26.422409 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 19 11:59:26.423576 master-0 kubenswrapper[17644]: I0319 11:59:26.422414 17644 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 11:59:26.423576 master-0 kubenswrapper[17644]: I0319 11:59:26.422910 17644 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 19 11:59:26.427488 master-0 kubenswrapper[17644]: I0319 11:59:26.427418 17644 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 11:59:26.434181 master-0 kubenswrapper[17644]: I0319 11:59:26.427945 17644 server.go:449] "Adding debug handlers to kubelet server" Mar 19 11:59:26.445319 master-0 kubenswrapper[17644]: E0319 11:59:26.445259 17644 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 19 11:59:26.445760 master-0 kubenswrapper[17644]: I0319 11:59:26.445669 17644 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 11:59:26.446133 master-0 kubenswrapper[17644]: I0319 11:59:26.446001 17644 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 11:43:03 +0000 UTC, rotation deadline is 2026-03-20 08:22:28.364402752 +0000 UTC Mar 19 11:59:26.446133 master-0 kubenswrapper[17644]: I0319 11:59:26.446035 17644 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h23m1.918369802s for next certificate rotation Mar 19 11:59:26.446133 master-0 kubenswrapper[17644]: I0319 11:59:26.445750 17644 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 11:59:26.448432 master-0 kubenswrapper[17644]: I0319 11:59:26.448397 17644 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 11:59:26.448432 master-0 kubenswrapper[17644]: I0319 11:59:26.448425 17644 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 11:59:26.448609 master-0 kubenswrapper[17644]: I0319 11:59:26.448588 17644 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 19 11:59:26.450025 master-0 kubenswrapper[17644]: E0319 11:59:26.449941 17644 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 11:59:26.455379 master-0 kubenswrapper[17644]: I0319 11:59:26.455344 17644 factory.go:153] Registering CRI-O factory Mar 19 11:59:26.455379 master-0 kubenswrapper[17644]: I0319 11:59:26.455378 17644 factory.go:221] Registration of the crio container factory successfully Mar 19 11:59:26.455511 master-0 kubenswrapper[17644]: I0319 11:59:26.455491 17644 factory.go:219] Registration of the containerd container factory failed: unable to create containerd 
client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 11:59:26.455511 master-0 kubenswrapper[17644]: I0319 11:59:26.455506 17644 factory.go:55] Registering systemd factory Mar 19 11:59:26.455511 master-0 kubenswrapper[17644]: I0319 11:59:26.455513 17644 factory.go:221] Registration of the systemd container factory successfully Mar 19 11:59:26.455600 master-0 kubenswrapper[17644]: I0319 11:59:26.455534 17644 factory.go:103] Registering Raw factory Mar 19 11:59:26.455600 master-0 kubenswrapper[17644]: I0319 11:59:26.455566 17644 manager.go:1196] Started watching for new ooms in manager Mar 19 11:59:26.456173 master-0 kubenswrapper[17644]: I0319 11:59:26.456148 17644 manager.go:319] Starting recovery of all containers Mar 19 11:59:26.461983 master-0 kubenswrapper[17644]: I0319 11:59:26.461909 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d06b230b-db67-4afc-8d10-2c33ad568462" volumeName="kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca" seLinuxMountContext="" Mar 19 11:59:26.461983 master-0 kubenswrapper[17644]: I0319 11:59:26.461969 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-trusted-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.461983 master-0 kubenswrapper[17644]: I0319 11:59:26.461985 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09a22c25-6073-4b1a-a029-928452ef37db" volumeName="kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.461996 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5f8c022c-7871-4765-971f-dcafa39357c9" volumeName="kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462007 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75aedbcd-f6ed-43a1-941b-4b04887ffe8e" volumeName="kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462017 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76cf2b01-33d9-47eb-be5d-44946c78bf20" volumeName="kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462027 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8438d015-106b-4aed-ae12-dda781ce51fc" volumeName="kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462038 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" volumeName="kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462050 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2ad29ad-70ef-43c6-91f6-02f04d145673" volumeName="kubernetes.io/configmap/e2ad29ad-70ef-43c6-91f6-02f04d145673-service-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462061 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="376b18a9-5f33-44fd-a37b-20ab02c5e65d" volumeName="kubernetes.io/empty-dir/376b18a9-5f33-44fd-a37b-20ab02c5e65d-cache" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462072 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="52bdf7cc-f07d-487e-937c-6567f194947e" volumeName="kubernetes.io/projected/52bdf7cc-f07d-487e-937c-6567f194947e-kube-api-access-8dbmq" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462081 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" volumeName="kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462090 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="00dd3703-af25-4e71-b20b-b3e153383489" volumeName="kubernetes.io/projected/00dd3703-af25-4e71-b20b-b3e153383489-kube-api-access-k9ddk" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462102 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="716c2176-50f9-4c4f-af0e-4c7973457df2" volumeName="kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462112 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6870ccc7-2094-48d8-9238-f486a4b8d5af" volumeName="kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462121 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1" volumeName="kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462131 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" volumeName="kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462145 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" volumeName="kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462154 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="00dd3703-af25-4e71-b20b-b3e153383489" volumeName="kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-catalog-content" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462164 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c" volumeName="kubernetes.io/projected/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-kube-api-access-2mxjl" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462177 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5c7eb66-e23e-40df-883c-fed012c07f26" volumeName="kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls" seLinuxMountContext="" Mar 19 11:59:26.462157 master-0 kubenswrapper[17644]: I0319 11:59:26.462186 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d06b230b-db67-4afc-8d10-2c33ad568462" volumeName="kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462199 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dedf55c4-eeda-4955-aafe-db1fdb8c4a48" volumeName="kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462209 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2ad29ad-70ef-43c6-91f6-02f04d145673" volumeName="kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-stats-auth" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462222 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e45616db-f7dd-4a08-847f-abf2759d9fa4" volumeName="kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-client" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462231 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b94d1eb-1b80-4a14-b1c0-d9e192231352" volumeName="kubernetes.io/empty-dir/1b94d1eb-1b80-4a14-b1c0-d9e192231352-cache" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462268 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac09dba7-398c-4b0a-a415-edb73cb4cf30" volumeName="kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462279 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a3ceeece-bee9-4fcb-8517-95ebce38e223" volumeName="kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462295 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-client" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462359 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="034cad93-a500-4c58-8d97-fa49866a0d5e" volumeName="kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462369 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b94d1eb-1b80-4a14-b1c0-d9e192231352" volumeName="kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-ca-certs" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462378 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1eef757-d63a-4708-8efe-7b27eea1ff63" volumeName="kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-catalog-content" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462389 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462420 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="cf6aab0e-defc-4a4b-8a07-f5af8bf177c4" volumeName="kubernetes.io/projected/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-kube-api-access-894bt" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462429 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d41245b-33d4-40f8-bbe1-6d2247e2e335" volumeName="kubernetes.io/projected/6d41245b-33d4-40f8-bbe1-6d2247e2e335-kube-api-access-k7bq7" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462439 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf6aab0e-defc-4a4b-8a07-f5af8bf177c4" volumeName="kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-catalog-content" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462449 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2ad29ad-70ef-43c6-91f6-02f04d145673" volumeName="kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-default-certificate" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462459 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2292109e-92a9-4286-858e-dcd2ac083c43" volumeName="kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462468 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d41245b-33d4-40f8-bbe1-6d2247e2e335" volumeName="kubernetes.io/empty-dir/6d41245b-33d4-40f8-bbe1-6d2247e2e335-tmpfs" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462478 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="0cbbe8d0-aafb-499f-a1f4-affcea62c1ab" volumeName="kubernetes.io/projected/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-kube-api-access-8v9bx" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462488 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e45616db-f7dd-4a08-847f-abf2759d9fa4" volumeName="kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462497 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3de8a1b-a5be-414f-86e8-738e16c8bc97" volumeName="kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462508 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5c7eb66-e23e-40df-883c-fed012c07f26" volumeName="kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462518 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103" volumeName="kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-utilities" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462529 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d625c81e-01cc-424a-997d-546a5204a72b" volumeName="kubernetes.io/projected/d625c81e-01cc-424a-997d-546a5204a72b-kube-api-access-tgzdh" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462539 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="163d6a3d-0080-4122-bb7a-17f6e63f00f0" volumeName="kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462548 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462560 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd6ec279-d92f-45c2-97c2-88b96fbd6600" volumeName="kubernetes.io/projected/dd6ec279-d92f-45c2-97c2-88b96fbd6600-kube-api-access" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462570 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8c022c-7871-4765-971f-dcafa39357c9" volumeName="kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462582 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1" volumeName="kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462593 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12809811-c9df-4e77-8c12-309831b8975d" volumeName="kubernetes.io/projected/12809811-c9df-4e77-8c12-309831b8975d-kube-api-access-bdx6s" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462603 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="75aedbcd-f6ed-43a1-941b-4b04887ffe8e" volumeName="kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462622 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c" volumeName="kubernetes.io/configmap/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-config-volume" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462634 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d63d5a8-f45d-4678-824d-5534b2bcd6ca" volumeName="kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462646 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76cf2b01-33d9-47eb-be5d-44946c78bf20" volumeName="kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462658 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="00dd3703-af25-4e71-b20b-b3e153383489" volumeName="kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-utilities" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462671 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cbbe8d0-aafb-499f-a1f4-affcea62c1ab" volumeName="kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462681 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" volumeName="kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462691 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf08ab4f-c203-4c16-9826-8cc049f4af31" volumeName="kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462701 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbcbba74-ac53-4724-a217-4d9b85e7c1db" volumeName="kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462715 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22e10648-af7c-409e-b947-570e7d807e05" volumeName="kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462770 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462781 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6611e325-6152-480c-9c2c-1b503e49ccd2" volumeName="kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462790 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8438d015-106b-4aed-ae12-dda781ce51fc" volumeName="kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462799 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbcbba74-ac53-4724-a217-4d9b85e7c1db" volumeName="kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462811 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="034cad93-a500-4c58-8d97-fa49866a0d5e" volumeName="kubernetes.io/projected/034cad93-a500-4c58-8d97-fa49866a0d5e-kube-api-access-jptl6" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462820 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8c022c-7871-4765-971f-dcafa39357c9" volumeName="kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462832 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462844 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac09dba7-398c-4b0a-a415-edb73cb4cf30" volumeName="kubernetes.io/projected/ac09dba7-398c-4b0a-a415-edb73cb4cf30-kube-api-access-pbhv4" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462855 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" volumeName="kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462866 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f88242-8b0b-4790-bbb6-445c19b34ee7" volumeName="kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462877 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e76fc3f-39a4-4f99-8603-38a94da6ea8e" volumeName="kubernetes.io/configmap/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-cabundle" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462890 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="034cad93-a500-4c58-8d97-fa49866a0d5e" volumeName="kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462901 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d63d5a8-f45d-4678-824d-5534b2bcd6ca" volumeName="kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462914 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f4aad0ff-e6cd-4c43-9561-80a14fee4712" volumeName="kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca" seLinuxMountContext="" Mar 19 11:59:26.462833 master-0 kubenswrapper[17644]: I0319 11:59:26.462927 17644 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="12809811-c9df-4e77-8c12-309831b8975d" volumeName="kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.462941 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="376b18a9-5f33-44fd-a37b-20ab02c5e65d" volumeName="kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-kube-api-access-f2hrw" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.462955 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e65e2a2f-16b5-44a3-9860-741f70188ab5" volumeName="kubernetes.io/projected/e65e2a2f-16b5-44a3-9860-741f70188ab5-kube-api-access-4fvvj" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.462967 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" volumeName="kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.462980 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8d8a09f-22d5-4f16-84d6-d5f2c504c949" volumeName="kubernetes.io/projected/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-kube-api-access-p5jsb" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.462991 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6611e325-6152-480c-9c2c-1b503e49ccd2" volumeName="kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463004 17644 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f29b11ce-60e0-46b3-8d28-eea3452513cd" volumeName="kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463016 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103" volumeName="kubernetes.io/projected/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-kube-api-access-hg6sp" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463027 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-audit" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463046 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463073 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463085 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="52bdf7cc-f07d-487e-937c-6567f194947e" volumeName="kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463099 17644 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bb22a965-9b36-40cd-993d-747a3978be8e" volumeName="kubernetes.io/projected/bb22a965-9b36-40cd-993d-747a3978be8e-kube-api-access-5p55f" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463110 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8d8a09f-22d5-4f16-84d6-d5f2c504c949" volumeName="kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463122 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e45616db-f7dd-4a08-847f-abf2759d9fa4" volumeName="kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-encryption-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463133 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="163d6a3d-0080-4122-bb7a-17f6e63f00f0" volumeName="kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463146 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f71770-714e-4111-9188-ad8663c6baa7" volumeName="kubernetes.io/projected/24f71770-714e-4111-9188-ad8663c6baa7-kube-api-access-m287x" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463157 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" volumeName="kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463175 17644 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f29b11ce-60e0-46b3-8d28-eea3452513cd" volumeName="kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463185 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e45616db-f7dd-4a08-847f-abf2759d9fa4" volumeName="kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-serving-ca" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463199 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09a22c25-6073-4b1a-a029-928452ef37db" volumeName="kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463210 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9" volumeName="kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463226 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75aedbcd-f6ed-43a1-941b-4b04887ffe8e" volumeName="kubernetes.io/projected/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-kube-api-access-dd6rv" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463237 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5c7eb66-e23e-40df-883c-fed012c07f26" volumeName="kubernetes.io/projected/b5c7eb66-e23e-40df-883c-fed012c07f26-kube-api-access-tx487" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 
kubenswrapper[17644]: I0319 11:59:26.463249 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f4aad0ff-e6cd-4c43-9561-80a14fee4712" volumeName="kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463262 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c3b0d24-ce5e-49c3-a546-874356f75dc6" volumeName="kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463274 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e2c195f-e97d-4cac-81fc-2d5c551d1c30" volumeName="kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463288 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daf4dbb6-5a0a-4c92-a930-479a7330ace1" volumeName="kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463300 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f71770-714e-4111-9188-ad8663c6baa7" volumeName="kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463334 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc" volumeName="kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config" seLinuxMountContext="" Mar 19 
11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463347 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8c022c-7871-4765-971f-dcafa39357c9" volumeName="kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463361 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463487 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7a51eeaf-1349-4bf3-932d-22ed5ce7c161" volumeName="kubernetes.io/projected/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-kube-api-access-cfxw7" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463502 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8438d015-106b-4aed-ae12-dda781ce51fc" volumeName="kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463517 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aaaaf539-bf61-44d7-8d47-97535b7aa1ba" volumeName="kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463530 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aaaaf539-bf61-44d7-8d47-97535b7aa1ba" volumeName="kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert" seLinuxMountContext="" Mar 19 
11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463542 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39d3ac31-9259-454b-8e1c-e23024f8f2b2" volumeName="kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463559 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8c022c-7871-4765-971f-dcafa39357c9" volumeName="kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463591 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3b6a8b5-bcaa-47f6-a9d5-6186981191d5" volumeName="kubernetes.io/projected/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5-kube-api-access-jdbjk" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463605 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92e401a4-ed2f-46f7-924b-329d7b313e6a" volumeName="kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463621 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e2c195f-e97d-4cac-81fc-2d5c551d1c30" volumeName="kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463634 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7a51eeaf-1349-4bf3-932d-22ed5ce7c161" volumeName="kubernetes.io/secret/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-control-plane-machine-set-operator-tls" 
seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463648 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75aedbcd-f6ed-43a1-941b-4b04887ffe8e" volumeName="kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463665 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" volumeName="kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463681 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463691 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc" volumeName="kubernetes.io/projected/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-kube-api-access-dr788" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463701 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daf4dbb6-5a0a-4c92-a930-479a7330ace1" volumeName="kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463713 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib" 
seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463742 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463754 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5c7eb66-e23e-40df-883c-fed012c07f26" volumeName="kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463765 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="163d6a3d-0080-4122-bb7a-17f6e63f00f0" volumeName="kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463775 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" volumeName="kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463786 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" volumeName="kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463796 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dedf55c4-eeda-4955-aafe-db1fdb8c4a48" volumeName="kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca" 
seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463859 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cbbe8d0-aafb-499f-a1f4-affcea62c1ab" volumeName="kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463870 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76cf2b01-33d9-47eb-be5d-44946c78bf20" volumeName="kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463881 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d06b230b-db67-4afc-8d10-2c33ad568462" volumeName="kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463892 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daf4dbb6-5a0a-4c92-a930-479a7330ace1" volumeName="kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463903 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/projected/e48b5aa9-293e-4222-91ff-7640addeca4c-kube-api-access-88ghj" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463915 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3ceeece-bee9-4fcb-8517-95ebce38e223" 
volumeName="kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463929 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8d8a09f-22d5-4f16-84d6-d5f2c504c949" volumeName="kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463940 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="034cad93-a500-4c58-8d97-fa49866a0d5e" volumeName="kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463950 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc" volumeName="kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463964 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aaaaf539-bf61-44d7-8d47-97535b7aa1ba" volumeName="kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463975 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dedf55c4-eeda-4955-aafe-db1fdb8c4a48" volumeName="kubernetes.io/projected/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-kube-api-access-lscpq" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.463989 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e2ad29ad-70ef-43c6-91f6-02f04d145673" volumeName="kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-metrics-certs" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464001 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e45616db-f7dd-4a08-847f-abf2759d9fa4" volumeName="kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-policies" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464013 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" volumeName="kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464024 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c3b0d24-ce5e-49c3-a546-874356f75dc6" volumeName="kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464034 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc" volumeName="kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464047 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf08ab4f-c203-4c16-9826-8cc049f4af31" volumeName="kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464058 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="716c2176-50f9-4c4f-af0e-4c7973457df2" volumeName="kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464073 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76cf2b01-33d9-47eb-be5d-44946c78bf20" volumeName="kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464084 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8376e1f9-ab05-42d4-aa66-284a167a9bfc" volumeName="kubernetes.io/projected/8376e1f9-ab05-42d4-aa66-284a167a9bfc-kube-api-access-n7784" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464093 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92e401a4-ed2f-46f7-924b-329d7b313e6a" volumeName="kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464103 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd6ec279-d92f-45c2-97c2-88b96fbd6600" volumeName="kubernetes.io/configmap/dd6ec279-d92f-45c2-97c2-88b96fbd6600-service-ca" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464113 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22e10648-af7c-409e-b947-570e7d807e05" volumeName="kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464122 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464132 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2ad29ad-70ef-43c6-91f6-02f04d145673" volumeName="kubernetes.io/projected/e2ad29ad-70ef-43c6-91f6-02f04d145673-kube-api-access-trcb7" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464468 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-encryption-config" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464483 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d41245b-33d4-40f8-bbe1-6d2247e2e335" volumeName="kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464494 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e76fc3f-39a4-4f99-8603-38a94da6ea8e" volumeName="kubernetes.io/projected/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-kube-api-access-5th4l" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464503 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464514 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" volumeName="kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464524 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bb22a965-9b36-40cd-993d-747a3978be8e" volumeName="kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464534 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-serving-ca" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464546 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464556 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f88242-8b0b-4790-bbb6-445c19b34ee7" volumeName="kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464566 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" volumeName="kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464575 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1b94d1eb-1b80-4a14-b1c0-d9e192231352" volumeName="kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-kube-api-access-srlcl" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464587 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d63d5a8-f45d-4678-824d-5534b2bcd6ca" volumeName="kubernetes.io/projected/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-api-access-kwrd5" seLinuxMountContext="" Mar 19 11:59:26.464775 master-0 kubenswrapper[17644]: I0319 11:59:26.464597 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8c022c-7871-4765-971f-dcafa39357c9" volumeName="kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465463 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d63d5a8-f45d-4678-824d-5534b2bcd6ca" volumeName="kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465482 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3053504d-0734-4def-b639-0f5cc2178185" volumeName="kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465493 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1" volumeName="kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465504 17644 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" volumeName="kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465514 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465524 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f71770-714e-4111-9188-ad8663c6baa7" volumeName="kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465532 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="376b18a9-5f33-44fd-a37b-20ab02c5e65d" volumeName="kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-ca-certs" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465542 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" volumeName="kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465571 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3ceeece-bee9-4fcb-8517-95ebce38e223" volumeName="kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465582 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="ac09dba7-398c-4b0a-a415-edb73cb4cf30" volumeName="kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465594 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dedf55c4-eeda-4955-aafe-db1fdb8c4a48" volumeName="kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465605 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465614 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d06b230b-db67-4afc-8d10-2c33ad568462" volumeName="kubernetes.io/empty-dir/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-textfile" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465641 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="732989c5-1b89-46f0-9917-b68613f7f005" volumeName="kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465652 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" volumeName="kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465664 17644 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="39d3ac31-9259-454b-8e1c-e23024f8f2b2" volumeName="kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465705 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39d3ac31-9259-454b-8e1c-e23024f8f2b2" volumeName="kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465714 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf6aab0e-defc-4a4b-8a07-f5af8bf177c4" volumeName="kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-utilities" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465724 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8c022c-7871-4765-971f-dcafa39357c9" volumeName="kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465748 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8d8a09f-22d5-4f16-84d6-d5f2c504c949" volumeName="kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465757 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f88242-8b0b-4790-bbb6-445c19b34ee7" volumeName="kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465767 17644 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="d06b230b-db67-4afc-8d10-2c33ad568462" volumeName="kubernetes.io/projected/d06b230b-db67-4afc-8d10-2c33ad568462-kube-api-access-4bbtl" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465881 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92e401a4-ed2f-46f7-924b-329d7b313e6a" volumeName="kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465895 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1" volumeName="kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465903 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3de8a1b-a5be-414f-86e8-738e16c8bc97" volumeName="kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465916 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103" volumeName="kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-catalog-content" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465925 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1eef757-d63a-4708-8efe-7b27eea1ff63" volumeName="kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-utilities" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465964 17644 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="f4aad0ff-e6cd-4c43-9561-80a14fee4712" volumeName="kubernetes.io/projected/f4aad0ff-e6cd-4c43-9561-80a14fee4712-kube-api-access-zndqq" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465974 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d63d5a8-f45d-4678-824d-5534b2bcd6ca" volumeName="kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465985 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8376e1f9-ab05-42d4-aa66-284a167a9bfc" volumeName="kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-tmp" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.465993 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f4aad0ff-e6cd-4c43-9561-80a14fee4712" volumeName="kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466002 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3de8a1b-a5be-414f-86e8-738e16c8bc97" volumeName="kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466014 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daf4dbb6-5a0a-4c92-a930-479a7330ace1" volumeName="kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466024 17644 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" volumeName="kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466033 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c898657-f06b-44ab-95ff-53a324759ba1" volumeName="kubernetes.io/projected/1c898657-f06b-44ab-95ff-53a324759ba1-kube-api-access-mt6bf" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466042 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" volumeName="kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466053 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5078f17-bc65-460f-9f18-8c506db6840b" volumeName="kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466063 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" volumeName="kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466073 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d63d5a8-f45d-4678-824d-5534b2bcd6ca" volumeName="kubernetes.io/empty-dir/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-volume-directive-shadow" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 
kubenswrapper[17644]: I0319 11:59:26.466082 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9" volumeName="kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466091 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1eef757-d63a-4708-8efe-7b27eea1ff63" volumeName="kubernetes.io/projected/d1eef757-d63a-4708-8efe-7b27eea1ff63-kube-api-access-kbq7n" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466107 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd6ec279-d92f-45c2-97c2-88b96fbd6600" volumeName="kubernetes.io/secret/dd6ec279-d92f-45c2-97c2-88b96fbd6600-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466119 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5078f17-bc65-460f-9f18-8c506db6840b" volumeName="kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466131 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6870ccc7-2094-48d8-9238-f486a4b8d5af" volumeName="kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466142 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" volumeName="kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28" seLinuxMountContext="" Mar 19 
11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466151 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e45616db-f7dd-4a08-847f-abf2759d9fa4" volumeName="kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-trusted-ca-bundle" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466161 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" volumeName="kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466171 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92e401a4-ed2f-46f7-924b-329d7b313e6a" volumeName="kubernetes.io/projected/92e401a4-ed2f-46f7-924b-329d7b313e6a-kube-api-access-c7nhq" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466181 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92e401a4-ed2f-46f7-924b-329d7b313e6a" volumeName="kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466199 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e76fc3f-39a4-4f99-8603-38a94da6ea8e" volumeName="kubernetes.io/secret/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-key" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466209 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8376e1f9-ab05-42d4-aa66-284a167a9bfc" volumeName="kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-tuned" seLinuxMountContext="" Mar 19 
11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466219 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9778f8f5-b0d1-4967-9776-9db758bba3af" volumeName="kubernetes.io/secret/9778f8f5-b0d1-4967-9776-9db758bba3af-tls-certificates" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466229 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="034cad93-a500-4c58-8d97-fa49866a0d5e" volumeName="kubernetes.io/empty-dir/034cad93-a500-4c58-8d97-fa49866a0d5e-snapshots" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466239 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12809811-c9df-4e77-8c12-309831b8975d" volumeName="kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466248 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6230ed8f-4608-4168-8f5a-656f411b0ef7" volumeName="kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466258 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="376b18a9-5f33-44fd-a37b-20ab02c5e65d" volumeName="kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466268 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6870ccc7-2094-48d8-9238-f486a4b8d5af" volumeName="kubernetes.io/projected/6870ccc7-2094-48d8-9238-f486a4b8d5af-kube-api-access-9dg9r" 
seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466279 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d41245b-33d4-40f8-bbe1-6d2247e2e335" volumeName="kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466290 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aaaaf539-bf61-44d7-8d47-97535b7aa1ba" volumeName="kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466299 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09a22c25-6073-4b1a-a029-928452ef37db" volumeName="kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466309 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c" volumeName="kubernetes.io/secret/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-metrics-tls" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466321 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" volumeName="kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466330 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="163d6a3d-0080-4122-bb7a-17f6e63f00f0" volumeName="kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca" 
seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466338 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbcbba74-ac53-4724-a217-4d9b85e7c1db" volumeName="kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466349 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76cf2b01-33d9-47eb-be5d-44946c78bf20" volumeName="kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466359 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8438d015-106b-4aed-ae12-dda781ce51fc" volumeName="kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466367 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e45616db-f7dd-4a08-847f-abf2759d9fa4" volumeName="kubernetes.io/projected/e45616db-f7dd-4a08-847f-abf2759d9fa4-kube-api-access-dvkxx" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466376 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e48b5aa9-293e-4222-91ff-7640addeca4c" volumeName="kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-image-import-ca" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466388 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6611e325-6152-480c-9c2c-1b503e49ccd2" 
volumeName="kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466397 17644 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9" volumeName="kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb" seLinuxMountContext="" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466405 17644 reconstruct.go:97] "Volume reconstruction finished" Mar 19 11:59:26.467794 master-0 kubenswrapper[17644]: I0319 11:59:26.466413 17644 reconciler.go:26] "Reconciler: start to sync state" Mar 19 11:59:26.481945 master-0 kubenswrapper[17644]: I0319 11:59:26.480414 17644 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 19 11:59:26.483843 master-0 kubenswrapper[17644]: I0319 11:59:26.482296 17644 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 19 11:59:26.483843 master-0 kubenswrapper[17644]: I0319 11:59:26.482351 17644 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 11:59:26.483843 master-0 kubenswrapper[17644]: I0319 11:59:26.482377 17644 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 11:59:26.483843 master-0 kubenswrapper[17644]: E0319 11:59:26.482441 17644 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 11:59:26.525413 master-0 kubenswrapper[17644]: I0319 11:59:26.525349 17644 generic.go:334] "Generic (PLEG): container finished" podID="8fe4839d-cef4-4ec9-b146-2ae9b76d8a76" containerID="b1921d5234eb4af4d7731c20be87a9595434841b33d272f8f2c3ade584fe4c62" exitCode=0 Mar 19 11:59:26.528467 master-0 kubenswrapper[17644]: I0319 11:59:26.528433 17644 generic.go:334] "Generic (PLEG): container finished" podID="f5d73fef-1414-4b29-97ea-42e1c0b1ef18" containerID="a00e4976297d868e9d1a74ee69351e1ac6225f1b3fff400804a95076bf8deddd" exitCode=0 Mar 19 11:59:26.534293 master-0 kubenswrapper[17644]: I0319 11:59:26.534178 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/0.log" Mar 19 11:59:26.534293 master-0 kubenswrapper[17644]: I0319 11:59:26.534224 17644 generic.go:334] "Generic (PLEG): container finished" podID="d625c81e-01cc-424a-997d-546a5204a72b" containerID="e21a965ed4cb2dd18edb22058723998ac546681c370497fc8735a2d87bc17971" exitCode=1 Mar 19 11:59:26.539539 master-0 kubenswrapper[17644]: I0319 11:59:26.539198 17644 generic.go:334] "Generic (PLEG): container finished" podID="732989c5-1b89-46f0-9917-b68613f7f005" containerID="4ee16bcaa03f25cf971556786ccb51f285719b794843e45ad52bd8134e676a54" exitCode=0 Mar 19 11:59:26.550685 master-0 kubenswrapper[17644]: E0319 11:59:26.550618 17644 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 11:59:26.582808 master-0 kubenswrapper[17644]: E0319 11:59:26.582683 17644 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 11:59:26.586664 master-0 kubenswrapper[17644]: I0319 11:59:26.586631 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-6ghdm_e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf/openshift-controller-manager-operator/1.log" Mar 19 11:59:26.586824 master-0 kubenswrapper[17644]: I0319 11:59:26.586678 17644 generic.go:334] "Generic (PLEG): container finished" podID="e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf" containerID="fd8c32d22caf0bf1b1f569479b7d959cb1e7f7190abe63f16601f2e5b50a0711" exitCode=255 Mar 19 11:59:26.591932 master-0 kubenswrapper[17644]: I0319 11:59:26.591901 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-7fz6w_3c3b0d24-ce5e-49c3-a546-874356f75dc6/network-operator/0.log" Mar 19 11:59:26.591994 master-0 kubenswrapper[17644]: I0319 11:59:26.591937 17644 generic.go:334] "Generic (PLEG): container finished" podID="3c3b0d24-ce5e-49c3-a546-874356f75dc6" containerID="a35a4f30770261f78e16c8cbde80e6ad1d01d59985d717446c5cf700c3ca0a3e" exitCode=255 Mar 19 11:59:26.594829 master-0 kubenswrapper[17644]: I0319 11:59:26.594798 17644 generic.go:334] "Generic (PLEG): container finished" podID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerID="87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5" exitCode=0 Mar 19 11:59:26.603013 master-0 kubenswrapper[17644]: I0319 11:59:26.602883 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/1.log" Mar 19 11:59:26.603554 master-0 
kubenswrapper[17644]: I0319 11:59:26.603500 17644 generic.go:334] "Generic (PLEG): container finished" podID="163d6a3d-0080-4122-bb7a-17f6e63f00f0" containerID="ceffe432bb3380aafe0729954185b3652b99ca21e97ac6c1e688d47217f36148" exitCode=1 Mar 19 11:59:26.607852 master-0 kubenswrapper[17644]: I0319 11:59:26.607802 17644 generic.go:334] "Generic (PLEG): container finished" podID="d06b230b-db67-4afc-8d10-2c33ad568462" containerID="1593c64a217270ac3de7b41e76b88277976a5cada758c58a75da6710a40d48b7" exitCode=0 Mar 19 11:59:26.610895 master-0 kubenswrapper[17644]: I0319 11:59:26.610861 17644 generic.go:334] "Generic (PLEG): container finished" podID="dbcbba74-ac53-4724-a217-4d9b85e7c1db" containerID="b6e56f4e0942ab58cf693081930c0b921d6a49180ecc1e1f47356ba56a945538" exitCode=0 Mar 19 11:59:26.612032 master-0 kubenswrapper[17644]: I0319 11:59:26.612003 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_870e66ff-82ed-4c91-8197-dddcb78048c2/installer/0.log" Mar 19 11:59:26.612032 master-0 kubenswrapper[17644]: I0319 11:59:26.612031 17644 generic.go:334] "Generic (PLEG): container finished" podID="870e66ff-82ed-4c91-8197-dddcb78048c2" containerID="42a335ff2e41047c0beba4d30a5bd16330153a9f1ce92821358c191efd6f3fc9" exitCode=1 Mar 19 11:59:26.617990 master-0 kubenswrapper[17644]: I0319 11:59:26.617935 17644 generic.go:334] "Generic (PLEG): container finished" podID="cf6aab0e-defc-4a4b-8a07-f5af8bf177c4" containerID="0d211d045e5393cc8859f1878115708d215b4f1d3901204a93e27c41a822a537" exitCode=0 Mar 19 11:59:26.617990 master-0 kubenswrapper[17644]: I0319 11:59:26.617981 17644 generic.go:334] "Generic (PLEG): container finished" podID="cf6aab0e-defc-4a4b-8a07-f5af8bf177c4" containerID="394e4d00faf263e34f605868a3854ebf366726976f687b2665ba581d5a0e6077" exitCode=0 Mar 19 11:59:26.621787 master-0 kubenswrapper[17644]: I0319 11:59:26.621680 17644 generic.go:334] "Generic (PLEG): container finished" 
podID="6bde080b-3820-463f-a27d-9fb9a7843d5d" containerID="bdd2ba95a96b40f792db569b1a38d500c6161c9b6b35b6b22d8099e9a3a35339" exitCode=0
Mar 19 11:59:26.625052 master-0 kubenswrapper[17644]: I0319 11:59:26.625014 17644 generic.go:334] "Generic (PLEG): container finished" podID="e48b5aa9-293e-4222-91ff-7640addeca4c" containerID="0041aa33e170f47251865926ed112bdffedc66315fe41f5f63242817433881b1" exitCode=0
Mar 19 11:59:26.643514 master-0 kubenswrapper[17644]: I0319 11:59:26.643451 17644 generic.go:334] "Generic (PLEG): container finished" podID="00dd3703-af25-4e71-b20b-b3e153383489" containerID="eb9b90eb220564280b7feca2d6bf46f46e6aa93fdc1901db274ac1aaeb65eea5" exitCode=0
Mar 19 11:59:26.643514 master-0 kubenswrapper[17644]: I0319 11:59:26.643486 17644 generic.go:334] "Generic (PLEG): container finished" podID="00dd3703-af25-4e71-b20b-b3e153383489" containerID="14faec67fbcbd3bef3715a945c5a2d9c7ecc242573ef637c229e56ff09166d0d" exitCode=0
Mar 19 11:59:26.646828 master-0 kubenswrapper[17644]: I0319 11:59:26.646795 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-mjwfm_1b94d1eb-1b80-4a14-b1c0-d9e192231352/manager/0.log"
Mar 19 11:59:26.646910 master-0 kubenswrapper[17644]: I0319 11:59:26.646839 17644 generic.go:334] "Generic (PLEG): container finished" podID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerID="8fcb298ecd66e79f2851c7b4502a7734938f56462fb5de52ed324ec2a3679f14" exitCode=1
Mar 19 11:59:26.653341 master-0 kubenswrapper[17644]: E0319 11:59:26.653012 17644 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:59:26.696756 master-0 kubenswrapper[17644]: I0319 11:59:26.696150 17644 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="901ed10fec5e9417fcd7522a27f15f9a949e9c0dd2ab8e429fd9b30afd0247bf" exitCode=1
Mar 19 11:59:26.743198 master-0 kubenswrapper[17644]: I0319 11:59:26.742258 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_8e508a43-99db-49eb-bf4e-e3e6a0f49761/installer/0.log"
Mar 19 11:59:26.743198 master-0 kubenswrapper[17644]: I0319 11:59:26.742321 17644 generic.go:334] "Generic (PLEG): container finished" podID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" containerID="300261e39c3fe1898b1aa4629252d5e05f336f7f74bdf1250eea81121a460d42" exitCode=1
Mar 19 11:59:26.753205 master-0 kubenswrapper[17644]: E0319 11:59:26.753149 17644 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:59:26.754578 master-0 kubenswrapper[17644]: I0319 11:59:26.754520 17644 generic.go:334] "Generic (PLEG): container finished" podID="3053504d-0734-4def-b639-0f5cc2178185" containerID="3e8362d7d083774070cfab73695a0128d3b617dc47c3ad8cda98be3e5d078943" exitCode=0
Mar 19 11:59:26.758137 master-0 kubenswrapper[17644]: I0319 11:59:26.758057 17644 generic.go:334] "Generic (PLEG): container finished" podID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerID="d0afa60868b67a2bbb33777d6af8334fc696accf5659fb55479d8c7b865f745d" exitCode=0
Mar 19 11:59:26.760765 master-0 kubenswrapper[17644]: I0319 11:59:26.760686 17644 generic.go:334] "Generic (PLEG): container finished" podID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerID="ecca7c744f565812652616c950bf4c3ba074defb48c439f60ea10ec59b205e80" exitCode=0
Mar 19 11:59:26.769850 master-0 kubenswrapper[17644]: I0319 11:59:26.769240 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-j528w_8438d015-106b-4aed-ae12-dda781ce51fc/approver/0.log"
Mar 19 11:59:26.769850 master-0 kubenswrapper[17644]: I0319 11:59:26.769759 17644 generic.go:334] "Generic (PLEG): container finished" podID="8438d015-106b-4aed-ae12-dda781ce51fc" containerID="27aeacdf42166ebdfe7943145673659894eb1a05c94251adf45a06c9d05c04a8" exitCode=1
Mar 19 11:59:26.782831 master-0 kubenswrapper[17644]: E0319 11:59:26.782777 17644 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 19 11:59:26.790959 master-0 kubenswrapper[17644]: I0319 11:59:26.790838 17644 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3" exitCode=0
Mar 19 11:59:26.790959 master-0 kubenswrapper[17644]: I0319 11:59:26.790889 17644 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0" exitCode=0
Mar 19 11:59:26.790959 master-0 kubenswrapper[17644]: I0319 11:59:26.790899 17644 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64" exitCode=0
Mar 19 11:59:26.840077 master-0 kubenswrapper[17644]: I0319 11:59:26.840019 17644 generic.go:334] "Generic (PLEG): container finished" podID="cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103" containerID="1e03cf34918a9df69167cf80628d7425b9668e84a411e5ec9a6953baa6d085c1" exitCode=0
Mar 19 11:59:26.840077 master-0 kubenswrapper[17644]: I0319 11:59:26.840060 17644 generic.go:334] "Generic (PLEG): container finished" podID="cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103" containerID="ea1dfb029f622e969a6f7e1a0158a90b12898130ad19366edfc5b84bc32d2910" exitCode=0
Mar 19 11:59:26.842892 master-0 kubenswrapper[17644]: I0319 11:59:26.842858 17644 generic.go:334] "Generic (PLEG): container finished" podID="1c576a88-6da4-43e9-a373-0df27a029f59" containerID="ddc94e7a85827e965bf13353b20a1293018b59883ea4cdbc55de2c9639ca8732" exitCode=0
Mar 19 11:59:26.852174 master-0 kubenswrapper[17644]: I0319 11:59:26.852104 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 19 11:59:26.852664 master-0 kubenswrapper[17644]: I0319 11:59:26.852616 17644 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5" exitCode=1
Mar 19 11:59:26.852664 master-0 kubenswrapper[17644]: I0319 11:59:26.852656 17644 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="5b728a95b5ae31dab98e905315ad7bc4e11c06682ed7961c2f8d666cf463933f" exitCode=0
Mar 19 11:59:26.853393 master-0 kubenswrapper[17644]: E0319 11:59:26.853287 17644 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:59:26.860817 master-0 kubenswrapper[17644]: I0319 11:59:26.858989 17644 generic.go:334] "Generic (PLEG): container finished" podID="b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d" containerID="ffd01994498e412e963b01ac06f0e6ad28082a18471897dde077305cc7888366" exitCode=0
Mar 19 11:59:26.867398 master-0 kubenswrapper[17644]: I0319 11:59:26.867340 17644 generic.go:334] "Generic (PLEG): container finished" podID="e45616db-f7dd-4a08-847f-abf2759d9fa4" containerID="731a4a3fe4e18c18e36f2a5f5b232c45ff2da66d02993af6a4921783cc680289" exitCode=0
Mar 19 11:59:26.870043 master-0 kubenswrapper[17644]: I0319 11:59:26.869658 17644 generic.go:334] "Generic (PLEG): container finished" podID="d1eef757-d63a-4708-8efe-7b27eea1ff63" containerID="80fbd86a4553b76ec244e68f8d481026639e06b5c8ed869b8aace67f7ab378b7" exitCode=0
Mar 19 11:59:26.870043 master-0 kubenswrapper[17644]: I0319 11:59:26.869711 17644 generic.go:334] "Generic (PLEG): container finished" podID="d1eef757-d63a-4708-8efe-7b27eea1ff63" containerID="ec41ae21e96c81776f1b758c8ccd3dc8a175f2bcaaf918fe5635b9c5e4aaf22a" exitCode=0
Mar 19 11:59:26.872635 master-0 kubenswrapper[17644]: I0319 11:59:26.872606 17644 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="39c756c5e9204811d8c83cfa45ff7447029413f92b87a61b82da1dc41e1a076d" exitCode=0
Mar 19 11:59:26.872635 master-0 kubenswrapper[17644]: I0319 11:59:26.872630 17644 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="21c17e15f1723f8eb75ec60f42ebd73c793697e640249886764928c881dbaaa1" exitCode=0
Mar 19 11:59:26.872778 master-0 kubenswrapper[17644]: I0319 11:59:26.872641 17644 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="d3ab6ca62e19d2ef407ccf237743444ad88802357f607cafd2e5c6b8ac29d477" exitCode=0
Mar 19 11:59:26.876028 master-0 kubenswrapper[17644]: I0319 11:59:26.875982 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/manager/0.log"
Mar 19 11:59:26.876469 master-0 kubenswrapper[17644]: I0319 11:59:26.876411 17644 generic.go:334] "Generic (PLEG): container finished" podID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerID="4c09f5575088b49e0ef7e52a5eb347dfd8470474e6a6ff5bf019908a8d6b87bc" exitCode=1
Mar 19 11:59:26.878500 master-0 kubenswrapper[17644]: I0319 11:59:26.878431 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-ng9ss_a3ceeece-bee9-4fcb-8517-95ebce38e223/openshift-config-operator/2.log"
Mar 19 11:59:26.878835 master-0 kubenswrapper[17644]: I0319 11:59:26.878801 17644 generic.go:334] "Generic (PLEG): container finished" podID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerID="6face1f4fa1bdc241551919e3ba7726fe31927b0d6796c2e5cb9454a7c5c0bf2" exitCode=255
Mar 19 11:59:26.878835 master-0 kubenswrapper[17644]: I0319 11:59:26.878833 17644 generic.go:334] "Generic (PLEG): container finished" podID="a3ceeece-bee9-4fcb-8517-95ebce38e223" containerID="dd209082a1a57426061cd8939f69f004966e7309cc74fc36f14397708b5c4388" exitCode=0
Mar 19 11:59:26.882989 master-0 kubenswrapper[17644]: I0319 11:59:26.882792 17644 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff" exitCode=0
Mar 19 11:59:26.888744 master-0 kubenswrapper[17644]: I0319 11:59:26.888678 17644 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="ec3103cf568fabdd9da2c1fe1b486c6e0c444ae0adfa29f7784e8224f29d03a4" exitCode=0
Mar 19 11:59:26.888744 master-0 kubenswrapper[17644]: I0319 11:59:26.888720 17644 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="a882ec3e14e198707c095bc0bdd34381c81e4c1697293837f13c4fc402ee5b87" exitCode=0
Mar 19 11:59:26.889286 master-0 kubenswrapper[17644]: I0319 11:59:26.888748 17644 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="e6b2ecdeb98ba4579257a0e7e4159cee8c04ebbb886d532c90b2d6925d5996ab" exitCode=0
Mar 19 11:59:26.889286 master-0 kubenswrapper[17644]: I0319 11:59:26.888763 17644 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="081b4d6f699ceead2b8cddd228d7b6dc1383135b83134925db54e215e05a85df" exitCode=0
Mar 19 11:59:26.889286 master-0 kubenswrapper[17644]: I0319 11:59:26.888775 17644 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="ac2545e0b2dd4885511fea2e8cd975f1d1867cae6d7a8bfbf5aa8fba195a8d88" exitCode=0
Mar 19 11:59:26.889286 master-0 kubenswrapper[17644]: I0319 11:59:26.888788 17644 generic.go:334] "Generic (PLEG): container finished" podID="bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a" containerID="de0412fe0521ed4585e79b055942133d1bae28dd08d3cd77acada0e7dc47ebba" exitCode=0
Mar 19 11:59:26.891058 master-0 kubenswrapper[17644]: I0319 11:59:26.891017 17644 generic.go:334] "Generic (PLEG): container finished" podID="66f88242-8b0b-4790-bbb6-445c19b34ee7" containerID="f48ebfe02dc1f93683f1d2eea873f5d0c2c3081e3483e2d09faebd411fa396ef" exitCode=0
Mar 19 11:59:26.896249 master-0 kubenswrapper[17644]: I0319 11:59:26.896204 17644 generic.go:334] "Generic (PLEG): container finished" podID="39d3ac31-9259-454b-8e1c-e23024f8f2b2" containerID="5e2f36e1befc8e73ca3645b7b8f74e7be8e2177e72629e38b72062f0d512ab82" exitCode=0
Mar 19 11:59:26.898028 master-0 kubenswrapper[17644]: I0319 11:59:26.897976 17644 generic.go:334] "Generic (PLEG): container finished" podID="9b61ea14-a7ea-49f3-9df4-5655765ddf7c" containerID="a63fe33504bcc71f9b4e0c9d251065dc432b3176905c1514b755fad213c3ed25" exitCode=0
Mar 19 11:59:26.902773 master-0 kubenswrapper[17644]: I0319 11:59:26.902709 17644 generic.go:334] "Generic (PLEG): container finished" podID="6611e325-6152-480c-9c2c-1b503e49ccd2" containerID="5b56b51126590bf802dd88d10f125adb62528aa19311215ff5bc2461894ca90f" exitCode=0
Mar 19 11:59:26.902895 master-0 kubenswrapper[17644]: I0319 11:59:26.902817 17644 generic.go:334] "Generic (PLEG): container finished" podID="6611e325-6152-480c-9c2c-1b503e49ccd2" containerID="f631f12266f4b047459015dd86cbaf1ce99efc325ca568b49d919857a5c8c1d9" exitCode=0
Mar 19 11:59:26.902895 master-0 kubenswrapper[17644]: I0319 11:59:26.902834 17644 generic.go:334] "Generic (PLEG): container finished" podID="6611e325-6152-480c-9c2c-1b503e49ccd2" containerID="4eed89e87867c4e687c139b0ec5fb8c1e755d1dd5bc8ea8ed4c3c3f5eeb362b4" exitCode=0
Mar 19 11:59:26.908253 master-0 kubenswrapper[17644]: I0319 11:59:26.908197 17644 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3" exitCode=1
Mar 19 11:59:26.911704 master-0 kubenswrapper[17644]: I0319 11:59:26.911647 17644 generic.go:334] "Generic (PLEG): container finished" podID="daf4dbb6-5a0a-4c92-a930-479a7330ace1" containerID="1961370e7c6f3b39c50205c4d3f694632a63f87701e4b9c1a6a05e005ec065b1" exitCode=0
Mar 19 11:59:26.953438 master-0 kubenswrapper[17644]: E0319 11:59:26.953369 17644 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:59:27.015679 master-0 kubenswrapper[17644]: I0319 11:59:27.015629 17644 manager.go:324] Recovery completed
Mar 19 11:59:27.057783 master-0 kubenswrapper[17644]: E0319 11:59:27.053807 17644 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:59:27.140789 master-0 kubenswrapper[17644]: I0319 11:59:27.140607 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.143814 master-0 kubenswrapper[17644]: I0319 11:59:27.143764 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.143908 master-0 kubenswrapper[17644]: I0319 11:59:27.143832 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.143908 master-0 kubenswrapper[17644]: I0319 11:59:27.143845 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.148821 master-0 kubenswrapper[17644]: I0319 11:59:27.148795 17644 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 19 11:59:27.148899 master-0 kubenswrapper[17644]: I0319 11:59:27.148886 17644 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 19 11:59:27.148977 master-0 kubenswrapper[17644]: I0319 11:59:27.148967 17644 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 11:59:27.149359 master-0 kubenswrapper[17644]: I0319 11:59:27.149316 17644 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 19 11:59:27.149494 master-0 kubenswrapper[17644]: I0319 11:59:27.149422 17644 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 19 11:59:27.149558 master-0 kubenswrapper[17644]: I0319 11:59:27.149548 17644 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 19 11:59:27.149606 master-0 kubenswrapper[17644]: I0319 11:59:27.149598 17644 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 19 11:59:27.149682 master-0 kubenswrapper[17644]: I0319 11:59:27.149674 17644 policy_none.go:49] "None policy: Start"
Mar 19 11:59:27.154846 master-0 kubenswrapper[17644]: E0319 11:59:27.154783 17644 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:59:27.155089 master-0 kubenswrapper[17644]: I0319 11:59:27.155033 17644 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 19 11:59:27.155140 master-0 kubenswrapper[17644]: I0319 11:59:27.155122 17644 state_mem.go:35] "Initializing new in-memory state store"
Mar 19 11:59:27.155607 master-0 kubenswrapper[17644]: I0319 11:59:27.155567 17644 state_mem.go:75] "Updated machine memory state"
Mar 19 11:59:27.155607 master-0 kubenswrapper[17644]: I0319 11:59:27.155591 17644 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 19 11:59:27.173931 master-0 kubenswrapper[17644]: I0319 11:59:27.173857 17644 manager.go:334] "Starting Device Plugin manager"
Mar 19 11:59:27.173931 master-0 kubenswrapper[17644]: I0319 11:59:27.173953 17644 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 19 11:59:27.173931 master-0 kubenswrapper[17644]: I0319 11:59:27.173971 17644 server.go:79] "Starting device plugin registration server"
Mar 19 11:59:27.175052 master-0 kubenswrapper[17644]: I0319 11:59:27.174959 17644 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 19 11:59:27.175052 master-0 kubenswrapper[17644]: I0319 11:59:27.174988 17644 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 19 11:59:27.175233 master-0 kubenswrapper[17644]: I0319 11:59:27.175126 17644 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 19 11:59:27.175233 master-0 kubenswrapper[17644]: I0319 11:59:27.175233 17644 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 11:59:27.175340 master-0 kubenswrapper[17644]: I0319 11:59:27.175245 17644 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 11:59:27.183283 master-0 kubenswrapper[17644]: I0319 11:59:27.183152 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 11:59:27.183543 master-0 kubenswrapper[17644]: I0319 11:59:27.183327 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.187100 master-0 kubenswrapper[17644]: I0319 11:59:27.187032 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.187100 master-0 kubenswrapper[17644]: I0319 11:59:27.187092 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.187100 master-0 kubenswrapper[17644]: I0319 11:59:27.187108 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.187453 master-0 kubenswrapper[17644]: I0319 11:59:27.187271 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.187453 master-0 kubenswrapper[17644]: I0319 11:59:27.187444 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.188662 master-0 kubenswrapper[17644]: E0319 11:59:27.188561 17644 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 11:59:27.193017 master-0 kubenswrapper[17644]: I0319 11:59:27.191928 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.193017 master-0 kubenswrapper[17644]: I0319 11:59:27.191967 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.193017 master-0 kubenswrapper[17644]: I0319 11:59:27.191990 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.193017 master-0 kubenswrapper[17644]: I0319 11:59:27.191990 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.193017 master-0 kubenswrapper[17644]: I0319 11:59:27.192032 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.193017 master-0 kubenswrapper[17644]: I0319 11:59:27.192051 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.193017 master-0 kubenswrapper[17644]: I0319 11:59:27.192293 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.193017 master-0 kubenswrapper[17644]: I0319 11:59:27.192531 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.196658 master-0 kubenswrapper[17644]: I0319 11:59:27.196578 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.196658 master-0 kubenswrapper[17644]: I0319 11:59:27.196650 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.196808 master-0 kubenswrapper[17644]: I0319 11:59:27.196666 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.197839 master-0 kubenswrapper[17644]: I0319 11:59:27.197794 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.197839 master-0 kubenswrapper[17644]: I0319 11:59:27.197830 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.197970 master-0 kubenswrapper[17644]: I0319 11:59:27.197852 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.198157 master-0 kubenswrapper[17644]: I0319 11:59:27.198117 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.198429 master-0 kubenswrapper[17644]: I0319 11:59:27.198388 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.202091 master-0 kubenswrapper[17644]: I0319 11:59:27.202044 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.202091 master-0 kubenswrapper[17644]: I0319 11:59:27.202087 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.202224 master-0 kubenswrapper[17644]: I0319 11:59:27.202106 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.202224 master-0 kubenswrapper[17644]: I0319 11:59:27.202155 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.202224 master-0 kubenswrapper[17644]: I0319 11:59:27.202190 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.202224 master-0 kubenswrapper[17644]: I0319 11:59:27.202202 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.202413 master-0 kubenswrapper[17644]: I0319 11:59:27.202391 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.202858 master-0 kubenswrapper[17644]: I0319 11:59:27.202682 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.204530 master-0 kubenswrapper[17644]: I0319 11:59:27.204504 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.204530 master-0 kubenswrapper[17644]: I0319 11:59:27.204535 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.204748 master-0 kubenswrapper[17644]: I0319 11:59:27.204548 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.204748 master-0 kubenswrapper[17644]: I0319 11:59:27.204699 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.204748 master-0 kubenswrapper[17644]: I0319 11:59:27.204710 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.204861 master-0 kubenswrapper[17644]: I0319 11:59:27.204719 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.204861 master-0 kubenswrapper[17644]: I0319 11:59:27.204824 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.204861 master-0 kubenswrapper[17644]: I0319 11:59:27.204840 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.207608 master-0 kubenswrapper[17644]: I0319 11:59:27.207568 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.207608 master-0 kubenswrapper[17644]: I0319 11:59:27.207590 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.207608 master-0 kubenswrapper[17644]: I0319 11:59:27.207599 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.208047 master-0 kubenswrapper[17644]: I0319 11:59:27.208015 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.208047 master-0 kubenswrapper[17644]: I0319 11:59:27.208047 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.208149 master-0 kubenswrapper[17644]: I0319 11:59:27.208062 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.208324 master-0 kubenswrapper[17644]: I0319 11:59:27.208303 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.208371 master-0 kubenswrapper[17644]: I0319 11:59:27.208332 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e775e2237b3e13100c3e1ab188e2c83cffcee4d11a252841dd57b5b92e5e9841"
Mar 19 11:59:27.208432 master-0 kubenswrapper[17644]: I0319 11:59:27.208358 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a"}
Mar 19 11:59:27.208474 master-0 kubenswrapper[17644]: I0319 11:59:27.208438 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"c238dcb10339e469e019f35f43263a486da7ad20431c7557165dd244d72db205"}
Mar 19 11:59:27.208504 master-0 kubenswrapper[17644]: I0319 11:59:27.208473 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89d6b9652bfd68fb0b68a832373fa141222adae111524f0fd223064e1824cd6a"
Mar 19 11:59:27.208570 master-0 kubenswrapper[17644]: I0319 11:59:27.208547 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825"}
Mar 19 11:59:27.208616 master-0 kubenswrapper[17644]: I0319 11:59:27.208584 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"901ed10fec5e9417fcd7522a27f15f9a949e9c0dd2ab8e429fd9b30afd0247bf"}
Mar 19 11:59:27.208656 master-0 kubenswrapper[17644]: I0319 11:59:27.208625 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"3ca871a2e4c187593092b1e6a4a9637d7435e4628b01bcadfea7c6a9560eeb21"}
Mar 19 11:59:27.208656 master-0 kubenswrapper[17644]: I0319 11:59:27.208652 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f8ce6740e029884a042c5c80751b9fc216004e3a3d167dbdc2e5cb2a86f8183"
Mar 19 11:59:27.208719 master-0 kubenswrapper[17644]: I0319 11:59:27.208685 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19faef5336e0e62090140de4619f79a9e64f33712b5b8e70590e04d8b85ea93f"
Mar 19 11:59:27.208775 master-0 kubenswrapper[17644]: I0319 11:59:27.208753 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b"}
Mar 19 11:59:27.208806 master-0 kubenswrapper[17644]: I0319 11:59:27.208772 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf"}
Mar 19 11:59:27.208806 master-0 kubenswrapper[17644]: I0319 11:59:27.208788 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5"}
Mar 19 11:59:27.208806 master-0 kubenswrapper[17644]: I0319 11:59:27.208800 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9"}
Mar 19 11:59:27.208888 master-0 kubenswrapper[17644]: I0319 11:59:27.208815 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46"}
Mar 19 11:59:27.208888 master-0 kubenswrapper[17644]: I0319 11:59:27.208828 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3"}
Mar 19 11:59:27.208888 master-0 kubenswrapper[17644]: I0319 11:59:27.208842 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0"}
Mar 19 11:59:27.208888 master-0 kubenswrapper[17644]: I0319 11:59:27.208855 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64"}
Mar 19 11:59:27.208888 master-0 kubenswrapper[17644]: I0319 11:59:27.208867 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"9a2616ea0257b4942755b9e9fb23bb4dfd3518f40e9ffe96a9ef4230caaa00fe"}
Mar 19 11:59:27.209049 master-0 kubenswrapper[17644]: I0319 11:59:27.208911 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6afce628b759b4a9bfac575d71074779271063662545462b264e568ed7ab2d8"
Mar 19 11:59:27.209049 master-0 kubenswrapper[17644]: I0319 11:59:27.208928 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"0252eb9b3a6c4d52db4e7759da29168fb6757ff67b4995374ebfa16c86b93541"}
Mar 19 11:59:27.209049 master-0 kubenswrapper[17644]: I0319 11:59:27.208943 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"d4ec9f8652caf61956bb350585a200ee75b716b204eab89e8110dd9c8c54f2a5"}
Mar 19 11:59:27.209049 master-0 kubenswrapper[17644]: I0319 11:59:27.208957 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"5b728a95b5ae31dab98e905315ad7bc4e11c06682ed7961c2f8d666cf463933f"}
Mar 19 11:59:27.209049 master-0 kubenswrapper[17644]: I0319 11:59:27.208970 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"c7fce19a33a5dd46ce06e3ec2001f8aae0d2c521be7c2647e59448b0833408c9"}
Mar 19 11:59:27.209049 master-0 kubenswrapper[17644]: I0319 11:59:27.209019 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc43d8901b61b03f0bff74bea5349f358784d720e2984f56ccc961dc3f630856"
Mar 19 11:59:27.209049 master-0 kubenswrapper[17644]: I0319 11:59:27.209050 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"3012b2963902713916d9cd34e1392325e6497b856aefe8cee37b525fe08e7328"}
Mar 19 11:59:27.209249 master-0 kubenswrapper[17644]: I0319 11:59:27.209068 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29"}
Mar 19 11:59:27.209249 master-0 kubenswrapper[17644]: I0319 11:59:27.209083 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4"}
Mar 19 11:59:27.209249 master-0 kubenswrapper[17644]: I0319 11:59:27.209095 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778"}
Mar 19 11:59:27.209249 master-0 kubenswrapper[17644]: I0319 11:59:27.209106 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd"}
Mar 19 11:59:27.209249 master-0 kubenswrapper[17644]: I0319 11:59:27.209119 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff"}
Mar 19 11:59:27.209249 master-0 kubenswrapper[17644]: I0319 11:59:27.209132 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"e78919d3ec5c9e1fc04085900a692953e2087a6d624466d667eb24bc45d8ddb6"}
Mar 19 11:59:27.209249 master-0 kubenswrapper[17644]: I0319 11:59:27.209198 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"b533d36029413fb01ef12a682704ad486041204246c172de95f0a4aeff2f5180"}
Mar 19 11:59:27.209481 master-0 kubenswrapper[17644]: I0319 11:59:27.209269 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3"}
Mar 19 11:59:27.209481 master-0 kubenswrapper[17644]: I0319 11:59:27.209285 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"6ecb192a1cfeb4529102ad33aeed1229502ac0d4a0688a01c8e90bffa6cdc39c"}
Mar 19 11:59:27.209481 master-0 kubenswrapper[17644]: I0319 11:59:27.209297 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4"}
Mar 19 11:59:27.209821 master-0 kubenswrapper[17644]: I0319 11:59:27.209794 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.209878 master-0 kubenswrapper[17644]: I0319 11:59:27.209825 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.209878 master-0 kubenswrapper[17644]: I0319 11:59:27.209839 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.275314 master-0 kubenswrapper[17644]: I0319 11:59:27.275232 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.278395 master-0 kubenswrapper[17644]: I0319 11:59:27.278357 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.278460 master-0 kubenswrapper[17644]: I0319 11:59:27.278398 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.278460 master-0 kubenswrapper[17644]: I0319 11:59:27.278437 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.278460 master-0 kubenswrapper[17644]: I0319 11:59:27.278458 17644 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:59:27.283062 master-0 kubenswrapper[17644]: E0319 11:59:27.282991 17644 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0"
Mar 19 11:59:27.483552 master-0 kubenswrapper[17644]: I0319 11:59:27.483445 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:27.486013 master-0 kubenswrapper[17644]: I0319 11:59:27.485964 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:27.486013 master-0 kubenswrapper[17644]: I0319 11:59:27.486008 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:27.486013 master-0 kubenswrapper[17644]: I0319 11:59:27.486019 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:27.486354 master-0 kubenswrapper[17644]: I0319 11:59:27.486039 17644 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:59:27.489522 master-0 kubenswrapper[17644]: E0319 11:59:27.489468 17644
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 19 11:59:27.890841 master-0 kubenswrapper[17644]: I0319 11:59:27.890648 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:59:27.893774 master-0 kubenswrapper[17644]: I0319 11:59:27.893713 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:59:27.893861 master-0 kubenswrapper[17644]: I0319 11:59:27.893795 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:59:27.893861 master-0 kubenswrapper[17644]: I0319 11:59:27.893806 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:59:27.893861 master-0 kubenswrapper[17644]: I0319 11:59:27.893830 17644 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:59:27.897941 master-0 kubenswrapper[17644]: E0319 11:59:27.897905 17644 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 19 11:59:28.276677 master-0 kubenswrapper[17644]: E0319 11:59:28.276598 17644 resource_metrics.go:161] "Error getting summary for resourceMetric prometheus endpoint" err="failed to get node info: node \"master-0\" not found" Mar 19 11:59:28.698920 master-0 kubenswrapper[17644]: I0319 11:59:28.698822 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:59:28.703004 master-0 kubenswrapper[17644]: I0319 11:59:28.702962 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 
11:59:28.703093 master-0 kubenswrapper[17644]: I0319 11:59:28.703013 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:59:28.703093 master-0 kubenswrapper[17644]: I0319 11:59:28.703024 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:59:28.703093 master-0 kubenswrapper[17644]: I0319 11:59:28.703049 17644 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:59:28.707435 master-0 kubenswrapper[17644]: E0319 11:59:28.707388 17644 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 19 11:59:30.307659 master-0 kubenswrapper[17644]: I0319 11:59:30.307543 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:59:30.311329 master-0 kubenswrapper[17644]: I0319 11:59:30.311285 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:59:30.311408 master-0 kubenswrapper[17644]: I0319 11:59:30.311344 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:59:30.311408 master-0 kubenswrapper[17644]: I0319 11:59:30.311363 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:59:30.311408 master-0 kubenswrapper[17644]: I0319 11:59:30.311394 17644 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:59:30.315497 master-0 kubenswrapper[17644]: E0319 11:59:30.315456 17644 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 
19 11:59:31.434771 master-0 kubenswrapper[17644]: I0319 11:59:31.434654 17644 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 11:59:31.435461 master-0 kubenswrapper[17644]: I0319 11:59:31.435110 17644 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 11:59:31.453556 master-0 kubenswrapper[17644]: I0319 11:59:31.453494 17644 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 11:59:31.473517 master-0 kubenswrapper[17644]: I0319 11:59:31.473447 17644 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 11:59:31.490244 master-0 kubenswrapper[17644]: I0319 11:59:31.490051 17644 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 11:59:31.574642 master-0 kubenswrapper[17644]: I0319 11:59:31.574562 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:59:31.574642 master-0 kubenswrapper[17644]: I0319 11:59:31.574614 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574718 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: 
\"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574783 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574809 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574829 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574849 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574867 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574907 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574962 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.575040 master-0 kubenswrapper[17644]: I0319 11:59:31.574991 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:59:31.575477 master-0 kubenswrapper[17644]: I0319 11:59:31.575160 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.575477 master-0 kubenswrapper[17644]: I0319 11:59:31.575253 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:31.575477 master-0 kubenswrapper[17644]: I0319 11:59:31.575282 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:31.575477 master-0 kubenswrapper[17644]: I0319 11:59:31.575306 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.575477 master-0 kubenswrapper[17644]: I0319 11:59:31.575328 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.575477 master-0 kubenswrapper[17644]: I0319 11:59:31.575351 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 
11:59:31.575477 master-0 kubenswrapper[17644]: I0319 11:59:31.575386 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.575477 master-0 kubenswrapper[17644]: I0319 11:59:31.575453 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.575792 master-0 kubenswrapper[17644]: I0319 11:59:31.575509 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:59:31.575792 master-0 kubenswrapper[17644]: I0319 11:59:31.575536 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.575792 master-0 kubenswrapper[17644]: I0319 11:59:31.575562 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" 
(UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.575792 master-0 kubenswrapper[17644]: I0319 11:59:31.575584 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.676340 master-0 kubenswrapper[17644]: I0319 11:59:31.676271 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:59:31.676340 master-0 kubenswrapper[17644]: I0319 11:59:31.676341 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:59:31.676682 master-0 kubenswrapper[17644]: I0319 11:59:31.676411 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.676682 master-0 kubenswrapper[17644]: I0319 11:59:31.676451 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " 
pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.676682 master-0 kubenswrapper[17644]: I0319 11:59:31.676483 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:31.676682 master-0 kubenswrapper[17644]: I0319 11:59:31.676510 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.676682 master-0 kubenswrapper[17644]: I0319 11:59:31.676643 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:59:31.676941 master-0 kubenswrapper[17644]: I0319 11:59:31.676899 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.677002 master-0 kubenswrapper[17644]: I0319 11:59:31.676975 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 
11:59:31.677087 master-0 kubenswrapper[17644]: I0319 11:59:31.677049 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.677125 master-0 kubenswrapper[17644]: I0319 11:59:31.677093 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.677178 master-0 kubenswrapper[17644]: I0319 11:59:31.677155 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.677224 master-0 kubenswrapper[17644]: I0319 11:59:31.677188 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.677224 master-0 kubenswrapper[17644]: I0319 11:59:31.677212 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.677303 master-0 kubenswrapper[17644]: I0319 11:59:31.677235 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.677303 master-0 kubenswrapper[17644]: I0319 11:59:31.677264 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.677303 master-0 kubenswrapper[17644]: I0319 11:59:31.677293 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:59:31.677396 master-0 kubenswrapper[17644]: I0319 11:59:31.677314 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.677396 master-0 kubenswrapper[17644]: I0319 11:59:31.677336 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: 
\"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:31.677396 master-0 kubenswrapper[17644]: I0319 11:59:31.677357 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:31.677396 master-0 kubenswrapper[17644]: I0319 11:59:31.677386 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.677514 master-0 kubenswrapper[17644]: I0319 11:59:31.677410 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.677514 master-0 kubenswrapper[17644]: I0319 11:59:31.677434 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:31.677514 master-0 kubenswrapper[17644]: I0319 11:59:31.677455 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") 
pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:59:31.677514 master-0 kubenswrapper[17644]: I0319 11:59:31.677478 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.677514 master-0 kubenswrapper[17644]: I0319 11:59:31.677499 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.677659 master-0 kubenswrapper[17644]: I0319 11:59:31.677120 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 11:59:31.677659 master-0 kubenswrapper[17644]: I0319 11:59:31.677536 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:59:31.678497 master-0 kubenswrapper[17644]: I0319 11:59:31.677090 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 
19 11:59:31.678650 master-0 kubenswrapper[17644]: I0319 11:59:31.678590 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 11:59:31.678884 master-0 kubenswrapper[17644]: I0319 11:59:31.678843 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 11:59:31.679075 master-0 kubenswrapper[17644]: I0319 11:59:31.678995 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:31.679130 master-0 kubenswrapper[17644]: I0319 11:59:31.679072 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 11:59:31.679130 master-0 kubenswrapper[17644]: I0319 11:59:31.679091 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 11:59:31.679130 master-0 kubenswrapper[17644]: I0319 11:59:31.679109 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:31.679230 master-0 kubenswrapper[17644]: I0319 11:59:31.679136 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:31.679230 master-0 kubenswrapper[17644]: I0319 11:59:31.679145 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 11:59:31.679230 master-0 kubenswrapper[17644]: I0319 11:59:31.679164 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 11:59:31.679230 master-0 kubenswrapper[17644]: I0319 11:59:31.679173 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 11:59:31.679230 master-0 kubenswrapper[17644]: I0319 11:59:31.679190 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:31.679230 master-0 kubenswrapper[17644]: I0319 11:59:31.679212 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 11:59:31.679409 master-0 kubenswrapper[17644]: I0319 11:59:31.679268 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 11:59:31.679409 master-0 kubenswrapper[17644]: I0319 11:59:31.679342 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 11:59:31.679409 master-0 kubenswrapper[17644]: I0319 11:59:31.679401 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:31.679505 master-0 kubenswrapper[17644]: I0319 11:59:31.679446 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 11:59:31.679868 master-0 kubenswrapper[17644]: I0319 11:59:31.679842 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 11:59:31.701470 master-0 kubenswrapper[17644]: I0319 11:59:31.701415 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 19 11:59:31.704003 master-0 kubenswrapper[17644]: I0319 11:59:31.703915 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 11:59:31.704157 master-0 kubenswrapper[17644]: I0319 11:59:31.704120 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 11:59:31.723358 master-0 kubenswrapper[17644]: I0319 11:59:31.719482 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 19 11:59:31.723358 master-0 kubenswrapper[17644]: I0319 11:59:31.723258 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 11:59:32.159129 master-0 kubenswrapper[17644]: I0319 11:59:32.159083 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-check-endpoints/0.log"
Mar 19 11:59:32.164714 master-0 kubenswrapper[17644]: I0319 11:59:32.163847 17644 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="3012b2963902713916d9cd34e1392325e6497b856aefe8cee37b525fe08e7328" exitCode=255
Mar 19 11:59:32.164714 master-0 kubenswrapper[17644]: I0319 11:59:32.164510 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"3012b2963902713916d9cd34e1392325e6497b856aefe8cee37b525fe08e7328"}
Mar 19 11:59:32.240388 master-0 kubenswrapper[17644]: E0319 11:59:32.240093 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 11:59:32.247462 master-0 kubenswrapper[17644]: I0319 11:59:32.246049 17644 scope.go:117] "RemoveContainer" containerID="3012b2963902713916d9cd34e1392325e6497b856aefe8cee37b525fe08e7328"
Mar 19 11:59:32.248123 master-0 kubenswrapper[17644]: E0319 11:59:32.247545 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 11:59:32.254658 master-0 kubenswrapper[17644]: E0319 11:59:32.253999 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Mar 19 11:59:32.258407 master-0 kubenswrapper[17644]: E0319 11:59:32.256092 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:32.326875 master-0 kubenswrapper[17644]: I0319 11:59:32.318470 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:32.344756 master-0 kubenswrapper[17644]: I0319 11:59:32.334993 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:32.421985 master-0 kubenswrapper[17644]: I0319 11:59:32.421830 17644 apiserver.go:52] "Watching apiserver"
Mar 19 11:59:32.444316 master-0 kubenswrapper[17644]: I0319 11:59:32.443875 17644 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 11:59:32.450927 master-0 kubenswrapper[17644]: I0319 11:59:32.450831 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2","openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6","openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr","openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm","openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86","openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n","openshift-kube-scheduler/installer-3-master-0","openshift-multus/network-metrics-daemon-f6wv7","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm","openshift-marketplace/marketplace-operator-89ccd998f-bftt4","openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn","openshift-dns-operator/dns-operator-9c5679d8f-965np","openshift-dns/node-resolver-pm77f","openshift-kube-apiserver/installer-1-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv","openshift-multus/multus-additional-cni-plugins-n8vwk","openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd","openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-machine-config-operator/machine-config-server-ltk8s","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h","openshift-marketplace/redhat-operators-w2fqh","openshift-network-node-identity/network-node-identity-j528w","openshift-network-operator/iptables-alerter-n52gc","assisted-installer/assisted-installer-controller-48bcp","kube-system/bootstrap-kube-controller-manager-master-0","openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls","openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w","kube-system/bootstrap-kube-scheduler-master-0","openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8","openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss","openshift-machine-config-operator/machine-config-daemon-mgzld","openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f","openshift-network-diagnostics/network-check-target-cr8n7","openshift-ovn-kubernetes/ovnkube-node-4qxkd","openshift-apiserver/apiserver-f67f6868b-chx8j","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5","openshift-etcd/etcd-master-0","openshift-kube-apiserver/installer-3-master-0","openshift-dns/dns-default-ztgjs","openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4","openshift-marketplace/certified-operators-gwt6h","openshift-multus/multus-552pc","openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq","openshift-marketplace/redhat-marketplace-ccbc5","openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng","openshift-etcd/installer-1-master-0","openshift-ingress/router-default-7dcf5569b5-kpmgt","openshift-insights/insights-operator-68bf6ff9d6-djfg8","openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f","openshift-marketplace/community-operators-h668l","openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh","openshift-monitoring/node-exporter-pnb9m","openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt","openshift-network-operator/network-operator-7bd846bfc4-7fz6w","openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk","openshift-kube-storage-version-migrator/migrator-8487694857-jls48","openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd","openshift-service-ca/service-ca-79bc6b8d76-lzfbh","openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9","openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5","openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz","openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq","openshift-cluster-node-tuning-operator/tuned-x6mmm","openshift-controller-manager/controller-manager-548bb99f44-txbjj","openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6","openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9","openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf","openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"]
Mar 19 11:59:32.451624 master-0 kubenswrapper[17644]: I0319 11:59:32.451329 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-48bcp"
Mar 19 11:59:32.455809 master-0 kubenswrapper[17644]: I0319 11:59:32.454141 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 11:59:32.455809 master-0 kubenswrapper[17644]: I0319 11:59:32.455186 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 11:59:32.459995 master-0 kubenswrapper[17644]: I0319 11:59:32.459611 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 11:59:32.460296 master-0 kubenswrapper[17644]: I0319 11:59:32.460057 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 11:59:32.460296 master-0 kubenswrapper[17644]: I0319 11:59:32.460263 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.461342 master-0 kubenswrapper[17644]: I0319 11:59:32.461303 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 11:59:32.461510 master-0 kubenswrapper[17644]: I0319 11:59:32.461501 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.461718 master-0 kubenswrapper[17644]: I0319 11:59:32.461701 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 11:59:32.469592 master-0 kubenswrapper[17644]: I0319 11:59:32.469542 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 11:59:32.469844 master-0 kubenswrapper[17644]: I0319 11:59:32.469815 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.469948 master-0 kubenswrapper[17644]: I0319 11:59:32.469815 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.470077 master-0 kubenswrapper[17644]: I0319 11:59:32.470038 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 11:59:32.470380 master-0 kubenswrapper[17644]: I0319 11:59:32.470343 17644 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="b761f07c-53be-436b-b05e-17a554cb94ed"
Mar 19 11:59:32.473918 master-0 kubenswrapper[17644]: I0319 11:59:32.473873 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 19 11:59:32.481820 master-0 kubenswrapper[17644]: I0319 11:59:32.481712 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 19 11:59:32.485386 master-0 kubenswrapper[17644]: I0319 11:59:32.485335 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 11:59:32.503201 master-0 kubenswrapper[17644]: I0319 11:59:32.503155 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:59:32.506117 master-0 kubenswrapper[17644]: I0319 11:59:32.504984 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.507969 master-0 kubenswrapper[17644]: I0319 11:59:32.507937 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 19 11:59:32.508190 master-0 kubenswrapper[17644]: I0319 11:59:32.508124 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 11:59:32.508302 master-0 kubenswrapper[17644]: I0319 11:59:32.508279 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 11:59:32.508396 master-0 kubenswrapper[17644]: I0319 11:59:32.508358 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 11:59:32.508464 master-0 kubenswrapper[17644]: I0319 11:59:32.508448 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 11:59:32.508709 master-0 kubenswrapper[17644]: I0319 11:59:32.508558 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.524071 master-0 kubenswrapper[17644]: I0319 11:59:32.523999 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 11:59:32.524482 master-0 kubenswrapper[17644]: I0319 11:59:32.524277 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 11:59:32.525062 master-0 kubenswrapper[17644]: I0319 11:59:32.524406 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 11:59:32.525382 master-0 kubenswrapper[17644]: I0319 11:59:32.525341 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 11:59:32.525447 master-0 kubenswrapper[17644]: I0319 11:59:32.525388 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 11:59:32.525596 master-0 kubenswrapper[17644]: I0319 11:59:32.525567 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 11:59:32.525778 master-0 kubenswrapper[17644]: I0319 11:59:32.525688 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.526011 master-0 kubenswrapper[17644]: I0319 11:59:32.525939 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 11:59:32.526048 master-0 kubenswrapper[17644]: I0319 11:59:32.526009 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 11:59:32.526213 master-0 kubenswrapper[17644]: I0319 11:59:32.526081 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 11:59:32.526213 master-0 kubenswrapper[17644]: I0319 11:59:32.526088 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 11:59:32.526303 master-0 kubenswrapper[17644]: I0319 11:59:32.526236 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.526537 master-0 kubenswrapper[17644]: I0319 11:59:32.526418 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 11:59:32.526654 master-0 kubenswrapper[17644]: I0319 11:59:32.526628 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.526689 master-0 kubenswrapper[17644]: I0319 11:59:32.526663 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 11:59:32.526807 master-0 kubenswrapper[17644]: I0319 11:59:32.526784 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 11:59:32.526854 master-0 kubenswrapper[17644]: I0319 11:59:32.526825 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 11:59:32.526854 master-0 kubenswrapper[17644]: I0319 11:59:32.526842 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 11:59:32.526916 master-0 kubenswrapper[17644]: I0319 11:59:32.526891 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.527018 master-0 kubenswrapper[17644]: I0319 11:59:32.527001 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 11:59:32.527105 master-0 kubenswrapper[17644]: I0319 11:59:32.527071 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 11:59:32.527611 master-0 kubenswrapper[17644]: I0319 11:59:32.527268 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.527611 master-0 kubenswrapper[17644]: I0319 11:59:32.527316 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.527672 master-0 kubenswrapper[17644]: I0319 11:59:32.527651 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 11:59:32.528259 master-0 kubenswrapper[17644]: I0319 11:59:32.527837 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 11:59:32.528259 master-0 kubenswrapper[17644]: I0319 11:59:32.528257 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 11:59:32.528961 master-0 kubenswrapper[17644]: I0319 11:59:32.528941 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.530407 master-0 kubenswrapper[17644]: I0319 11:59:32.530383 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531142 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531167 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531257 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531280 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531344 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531373 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531482 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531642 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 11:59:32.531793 master-0 kubenswrapper[17644]: I0319 11:59:32.531678 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.532147 master-0 kubenswrapper[17644]: I0319 11:59:32.531810 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 11:59:32.532147 master-0 kubenswrapper[17644]: I0319 11:59:32.532109 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.532250 master-0 kubenswrapper[17644]: I0319 11:59:32.532236 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 11:59:32.532365 master-0 kubenswrapper[17644]: I0319 11:59:32.532333 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 11:59:32.532695 master-0 kubenswrapper[17644]: I0319 11:59:32.532670 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 11:59:32.532739 master-0 kubenswrapper[17644]: I0319 11:59:32.532711 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 11:59:32.532848 master-0 kubenswrapper[17644]: I0319 11:59:32.532827 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 11:59:32.532943 master-0 kubenswrapper[17644]: I0319 11:59:32.532916 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 11:59:32.533020 master-0 kubenswrapper[17644]: I0319 11:59:32.532996 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 11:59:32.533096 master-0 kubenswrapper[17644]: I0319 11:59:32.532116 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 11:59:32.533238 master-0 kubenswrapper[17644]: I0319 11:59:32.533212 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 11:59:32.533313 master-0 kubenswrapper[17644]: I0319 11:59:32.533300 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.533347 master-0 kubenswrapper[17644]: I0319 11:59:32.533333 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 11:59:32.533430 master-0 kubenswrapper[17644]: I0319 11:59:32.533083 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.533466 master-0 kubenswrapper[17644]: I0319 11:59:32.533441 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.533635 master-0 kubenswrapper[17644]: I0319 11:59:32.533613 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 11:59:32.534020 master-0 kubenswrapper[17644]: I0319 11:59:32.533814 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 11:59:32.534020 master-0 kubenswrapper[17644]: I0319 11:59:32.533839 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 11:59:32.534280 master-0 kubenswrapper[17644]: I0319 11:59:32.534260 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.534416 master-0 kubenswrapper[17644]: I0319 11:59:32.534356 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 11:59:32.534494 master-0 kubenswrapper[17644]: I0319 11:59:32.534472 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 11:59:32.534564 master-0 kubenswrapper[17644]: I0319 11:59:32.534546 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 11:59:32.535225 master-0 kubenswrapper[17644]: I0319 11:59:32.534846 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 11:59:32.535225 master-0 kubenswrapper[17644]: I0319 11:59:32.534865 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 11:59:32.535225 master-0 kubenswrapper[17644]: I0319 11:59:32.535023 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.535225 master-0 kubenswrapper[17644]: I0319 11:59:32.535030 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 11:59:32.535225 master-0 kubenswrapper[17644]: I0319 11:59:32.535100 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 11:59:32.535225 master-0 kubenswrapper[17644]: I0319 11:59:32.535173 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 11:59:32.535450 master-0 kubenswrapper[17644]: I0319 11:59:32.535383 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.535689 master-0 kubenswrapper[17644]: I0319 11:59:32.535667 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 11:59:32.536875 master-0 kubenswrapper[17644]: I0319 11:59:32.536853 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 11:59:32.537374 master-0 kubenswrapper[17644]: I0319 11:59:32.537355 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.543848 master-0 kubenswrapper[17644]: I0319 11:59:32.541079 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 11:59:32.544272 master-0 kubenswrapper[17644]: I0319 11:59:32.544136 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.547772 master-0 kubenswrapper[17644]: I0319 11:59:32.544429 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 11:59:32.566870 master-0 kubenswrapper[17644]: I0319 11:59:32.544522 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.566870 master-0 kubenswrapper[17644]: I0319 11:59:32.544503 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 11:59:32.566870 master-0 kubenswrapper[17644]: I0319 11:59:32.546463 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 11:59:32.566870 master-0 kubenswrapper[17644]: I0319 11:59:32.547690 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 11:59:32.567530 master-0 kubenswrapper[17644]: I0319 11:59:32.567086 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 11:59:32.568593 master-0 kubenswrapper[17644]: I0319 11:59:32.567849 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.572758 master-0 kubenswrapper[17644]: I0319 11:59:32.571677 17644 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 19 11:59:32.573766 master-0 kubenswrapper[17644]: I0319 11:59:32.573280 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 19 11:59:32.576771 master-0 kubenswrapper[17644]: I0319 11:59:32.574432 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 11:59:32.576771 master-0 kubenswrapper[17644]: I0319 11:59:32.575494 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 11:59:32.578736 master-0 kubenswrapper[17644]: I0319 11:59:32.577848 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 11:59:32.578928 master-0 kubenswrapper[17644]: I0319 11:59:32.578894 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 11:59:32.578980 master-0 kubenswrapper[17644]: I0319 11:59:32.578939 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 11:59:32.580866 master-0 kubenswrapper[17644]: I0319 11:59:32.580015 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 11:59:32.594578 master-0 kubenswrapper[17644]: I0319 11:59:32.593912 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595297 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9"
Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595336 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrgqb\" (UniqueName: \"kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss"
Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595355 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f"
Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595376 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn"
Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595396 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6rv\" (UniqueName: \"kubernetes.io/projected/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-kube-api-access-dd6rv\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6"
Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595413 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd"
Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595434 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj527\" (UniqueName: \"kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527\") pod \"controller-manager-548bb99f44-txbjj\"
(UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595452 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-tmp\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595471 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595489 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkm97\" (UniqueName: \"kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595507 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595529 17644 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595548 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcb7\" (UniqueName: \"kubernetes.io/projected/e2ad29ad-70ef-43c6-91f6-02f04d145673-kube-api-access-trcb7\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595568 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595599 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595618 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: 
\"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595635 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595653 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-sys\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595677 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595694 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595713 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qql5t\" (UniqueName: \"kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595752 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgs4l\" (UniqueName: \"kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595770 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595789 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rm4\" (UniqueName: \"kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595808 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bb2x\" (UniqueName: \"kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x\") pod 
\"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595827 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595845 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595863 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595882 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-textfile\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595900 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595918 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-modprobe-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595936 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbjk\" (UniqueName: \"kubernetes.io/projected/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5-kube-api-access-jdbjk\") pod \"migrator-8487694857-jls48\" (UID: \"f3b6a8b5-bcaa-47f6-a9d5-6186981191d5\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595952 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595971 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 
11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.595992 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596010 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596038 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-trusted-ca-bundle\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596055 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596071 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls\") 
pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596091 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-conf\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596112 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6ec279-d92f-45c2-97c2-88b96fbd6600-serving-cert\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596134 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596150 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d41245b-33d4-40f8-bbe1-6d2247e2e335-tmpfs\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596166 17644 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-sys\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596181 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysconfig\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596197 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596213 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596231 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd6ec279-d92f-45c2-97c2-88b96fbd6600-kube-api-access\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " 
pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596269 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b94d1eb-1b80-4a14-b1c0-d9e192231352-cache\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596291 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-policies\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596310 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596337 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-config-volume\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596355 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nfnb\" (UniqueName: \"kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb\") pod 
\"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596375 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596393 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596413 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596434 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" 
Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596453 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596472 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qrb\" (UniqueName: \"kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596490 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596509 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596527 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596574 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596592 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbq7n\" (UniqueName: \"kubernetes.io/projected/d1eef757-d63a-4708-8efe-7b27eea1ff63-kube-api-access-kbq7n\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596608 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bbtl\" (UniqueName: \"kubernetes.io/projected/d06b230b-db67-4afc-8d10-2c33ad568462-kube-api-access-4bbtl\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596625 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 
11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596642 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596659 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pngsr\" (UniqueName: \"kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596680 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596698 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596717 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596890 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8bmw\" (UniqueName: \"kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596911 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596935 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-encryption-config\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596953 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 
11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596974 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.596994 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597011 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597029 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597048 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597066 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597085 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvz6\" (UniqueName: \"kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597102 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597119 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7784\" (UniqueName: \"kubernetes.io/projected/8376e1f9-ab05-42d4-aa66-284a167a9bfc-kube-api-access-n7784\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 
11:59:32.597137 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597153 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-run\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597170 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597189 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbhv4\" (UniqueName: \"kubernetes.io/projected/ac09dba7-398c-4b0a-a415-edb73cb4cf30-kube-api-access-pbhv4\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597209 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2hrw\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-kube-api-access-f2hrw\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: 
\"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597232 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597251 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6bf\" (UniqueName: \"kubernetes.io/projected/1c898657-f06b-44ab-95ff-53a324759ba1-kube-api-access-mt6bf\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597270 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-image-import-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597288 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597308 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rt57\" (UniqueName: 
\"kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57\") pod \"csi-snapshot-controller-operator-5f5d689c6b-fx8ng\" (UID: \"2292109e-92a9-4286-858e-dcd2ac083c43\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597329 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597350 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597370 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597389 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88ghj\" (UniqueName: \"kubernetes.io/projected/e48b5aa9-293e-4222-91ff-7640addeca4c-kube-api-access-88ghj\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " 
pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597409 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx487\" (UniqueName: \"kubernetes.io/projected/b5c7eb66-e23e-40df-883c-fed012c07f26-kube-api-access-tx487\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597429 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597445 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c898657-f06b-44ab-95ff-53a324759ba1-hosts-file\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597462 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-metrics-tls\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597480 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-serving-cert\") pod \"apiserver-f67f6868b-chx8j\" (UID: 
\"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597543 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597564 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597590 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597607 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597626 17644 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597644 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-tuned\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597663 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lscpq\" (UniqueName: \"kubernetes.io/projected/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-kube-api-access-lscpq\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597685 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597703 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 
11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597723 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597759 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-var-lib-kubelet\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597798 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597818 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597837 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-client\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597858 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597878 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597895 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597913 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " 
pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597933 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2ad29ad-70ef-43c6-91f6-02f04d145673-service-ca-bundle\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597952 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597970 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.597988 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-serving-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.598006 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:59:32.601270 master-0 kubenswrapper[17644]: I0319 11:59:32.598024 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598044 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598062 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598080 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-utilities\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l" Mar 19 
11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598098 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-root\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598115 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598132 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598151 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598169 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7nhq\" (UniqueName: \"kubernetes.io/projected/92e401a4-ed2f-46f7-924b-329d7b313e6a-kube-api-access-c7nhq\") pod \"cluster-baremetal-operator-6f69995874-942g6\" 
(UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598187 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-utilities\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598211 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-lib-modules\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598235 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fvvj\" (UniqueName: \"kubernetes.io/projected/e65e2a2f-16b5-44a3-9860-741f70188ab5-kube-api-access-4fvvj\") pod \"network-check-source-b4bf74f6-llsdf\" (UID: \"e65e2a2f-16b5-44a3-9860-741f70188ab5\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598254 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598273 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598291 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598310 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdx6s\" (UniqueName: \"kubernetes.io/projected/12809811-c9df-4e77-8c12-309831b8975d-kube-api-access-bdx6s\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598328 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598344 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598363 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598382 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-894bt\" (UniqueName: \"kubernetes.io/projected/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-kube-api-access-894bt\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598399 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598416 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598433 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598450 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dg9r\" (UniqueName: \"kubernetes.io/projected/6870ccc7-2094-48d8-9238-f486a4b8d5af-kube-api-access-9dg9r\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598469 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598489 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m287x\" (UniqueName: \"kubernetes.io/projected/24f71770-714e-4111-9188-ad8663c6baa7-kube-api-access-m287x\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598508 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr788\" (UniqueName: \"kubernetes.io/projected/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-kube-api-access-dr788\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " 
pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598527 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g997b\" (UniqueName: \"kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598544 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-default-certificate\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598562 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.598578 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599311 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599330 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599348 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599365 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599385 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-catalog-content\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599402 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599419 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599435 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599452 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599468 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-kubernetes\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599485 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599503 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599525 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599542 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599559 17644 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599579 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599595 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599612 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599630 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvkxx\" (UniqueName: \"kubernetes.io/projected/e45616db-f7dd-4a08-847f-abf2759d9fa4-kube-api-access-dvkxx\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " 
pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599647 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg6sp\" (UniqueName: \"kubernetes.io/projected/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-kube-api-access-hg6sp\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599664 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599683 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599701 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5fnx\" (UniqueName: \"kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599717 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" 
(UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599752 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqr6w\" (UniqueName: \"kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599770 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599790 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599807 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlr9q\" (UniqueName: \"kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " 
pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599827 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599845 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599863 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72jlb\" (UniqueName: \"kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599880 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599898 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599916 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599934 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd6ec279-d92f-45c2-97c2-88b96fbd6600-service-ca\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599951 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599968 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " 
pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.599986 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600003 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600021 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600040 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-catalog-content\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600057 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-wtmp\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600076 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9bx\" (UniqueName: \"kubernetes.io/projected/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-kube-api-access-8v9bx\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600095 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7bq7\" (UniqueName: \"kubernetes.io/projected/6d41245b-33d4-40f8-bbe1-6d2247e2e335-kube-api-access-k7bq7\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600118 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600134 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.605044 master-0 
kubenswrapper[17644]: I0319 11:59:32.600175 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600195 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wls49\" (UniqueName: \"kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600212 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-utilities\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600230 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600249 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9778f8f5-b0d1-4967-9776-9db758bba3af-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-69c6b55594-89rdt\" (UID: \"9778f8f5-b0d1-4967-9776-9db758bba3af\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600291 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-serving-cert\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600309 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600326 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-audit-dir\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600343 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600362 17644 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600381 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600397 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600415 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-metrics-certs\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600433 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " 
pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600451 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-audit\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600470 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600488 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrd5\" (UniqueName: \"kubernetes.io/projected/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-api-access-kwrd5\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600508 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tc5\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600570 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600593 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zndqq\" (UniqueName: \"kubernetes.io/projected/f4aad0ff-e6cd-4c43-9561-80a14fee4712-kube-api-access-zndqq\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600611 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxjl\" (UniqueName: \"kubernetes.io/projected/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-kube-api-access-2mxjl\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600650 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600670 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600687 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600706 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600791 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600829 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600849 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600867 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600907 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600933 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600953 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfxw7\" (UniqueName: \"kubernetes.io/projected/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-kube-api-access-cfxw7\") pod \"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.600994 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601013 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5th4l\" (UniqueName: \"kubernetes.io/projected/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-kube-api-access-5th4l\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601033 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5skx\" (UniqueName: \"kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601075 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601094 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-trusted-ca-bundle\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601140 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601162 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/034cad93-a500-4c58-8d97-fa49866a0d5e-snapshots\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601182 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601219 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601241 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601258 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601278 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p55f\" (UniqueName: \"kubernetes.io/projected/bb22a965-9b36-40cd-993d-747a3978be8e-kube-api-access-5p55f\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601317 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-systemd\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601338 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnl28\" (UniqueName: \"kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601381 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601401 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601469 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601491 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601510 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601551 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:59:32.605044 master-0 kubenswrapper[17644]: I0319 11:59:32.601571 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jptl6\" (UniqueName: \"kubernetes.io/projected/034cad93-a500-4c58-8d97-fa49866a0d5e-kube-api-access-jptl6\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601590 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601646 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srlcl\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-kube-api-access-srlcl\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601668 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nds54\" (UniqueName: \"kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601706 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/376b18a9-5f33-44fd-a37b-20ab02c5e65d-cache\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601754 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601775 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46m89\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601801 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601842 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601866 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgz7q\" (UniqueName: \"kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601890 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.601934 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jsb\" (UniqueName: \"kubernetes.io/projected/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-kube-api-access-p5jsb\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602190 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-config\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602383 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-utilities\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602412 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4wk\" (UniqueName: \"kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602467 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d3ac31-9259-454b-8e1c-e23024f8f2b2-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602582 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602649 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b94d1eb-1b80-4a14-b1c0-d9e192231352-cache\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602816 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p4hg\" (UniqueName: \"kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602847 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9ddk\" (UniqueName: \"kubernetes.io/projected/00dd3703-af25-4e71-b20b-b3e153383489-kube-api-access-k9ddk\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602863 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c3b0d24-ce5e-49c3-a546-874356f75dc6-metrics-tls\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602890 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-key\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602915 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602937 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-encryption-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602970 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.602990 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-host\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603051 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603067 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-utilities\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603114 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603181 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cf08ab4f-c203-4c16-9826-8cc049f4af31-srv-cert\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603188 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-utilities\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603253 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/24f71770-714e-4111-9188-ad8663c6baa7-rootfs\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603319 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-utilities\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603339 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603387 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603414 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603436 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-catalog-content\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603479 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-key\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603543 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1eef757-d63a-4708-8efe-7b27eea1ff63-catalog-content\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603545 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v27lg\" (UniqueName: \"kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603635 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603648 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22e10648-af7c-409e-b947-570e7d807e05-metrics-tls\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603922 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-textfile\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603935 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d41245b-33d4-40f8-bbe1-6d2247e2e335-tmpfs\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.603943 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.604011 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.604311 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.604296 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.604335 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.604355 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-client\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.604367 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8438d015-106b-4aed-ae12-dda781ce51fc-webhook-cert\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w"
Mar 19 11:59:32.609323 master-0 kubenswrapper[17644]: I0319 11:59:32.604367 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-serving-cert\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc"
Mar 19 11:59:32.612748 master-0 kubenswrapper[17644]: I0319 11:59:32.611368 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732989c5-1b89-46f0-9917-b68613f7f005-serving-cert\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94"
Mar 19 11:59:32.612748 master-0 kubenswrapper[17644]: I0319 11:59:32.611554 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-tmp\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm"
Mar 19 11:59:32.612748 master-0 kubenswrapper[17644]: I0319 11:59:32.611581 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f29b11ce-60e0-46b3-8d28-eea3452513cd-metrics-certs\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7"
Mar 19 11:59:32.612748 master-0 kubenswrapper[17644]: I0319 11:59:32.612091 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbcbba74-ac53-4724-a217-4d9b85e7c1db-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh"
Mar 19 11:59:32.612748 master-0 kubenswrapper[17644]: I0319 11:59:32.612250 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-catalog-content\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:59:32.612748 master-0 kubenswrapper[17644]: I0319 11:59:32.612397 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm"
Mar 19 11:59:32.612748 master-0 kubenswrapper[17644]: I0319 11:59:32.612488 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-tuned\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm"
Mar 19 11:59:32.612748 master-0 kubenswrapper[17644]: I0319 11:59:32.612678 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f88242-8b0b-4790-bbb6-445c19b34ee7-config\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn"
Mar 19 11:59:32.613635 master-0 kubenswrapper[17644]: I0319 11:59:32.613056 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-image-import-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j"
Mar 19 11:59:32.613635 master-0 kubenswrapper[17644]: I0319 11:59:32.613367 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2"
Mar 19 11:59:32.613635 master-0 kubenswrapper[17644]: I0319 11:59:32.613400 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ceeece-bee9-4fcb-8517-95ebce38e223-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:59:32.613857 master-0 kubenswrapper[17644]: I0319 11:59:32.613778 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.613948 master-0 kubenswrapper[17644]: I0319 11:59:32.613917 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/732989c5-1b89-46f0-9917-b68613f7f005-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:59:32.614389 master-0 kubenswrapper[17644]: I0319 11:59:32.614196 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 11:59:32.614389 master-0 kubenswrapper[17644]: I0319 11:59:32.614316 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-serving-cert\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 11:59:32.614394 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 11:59:32.614711 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5078f17-bc65-460f-9f18-8c506db6840b-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 11:59:32.614857 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3053504d-0734-4def-b639-0f5cc2178185-ovn-node-metrics-cert\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 11:59:32.614862 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/6611e325-6152-480c-9c2c-1b503e49ccd2-operand-assets\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 11:59:32.614989 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 
11:59:32.615059 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66f88242-8b0b-4790-bbb6-445c19b34ee7-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 11:59:32.615100 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-encryption-config\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 11:59:32.615275 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-client\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.615419 master-0 kubenswrapper[17644]: I0319 11:59:32.615381 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-serving-cert\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.615964 master-0 kubenswrapper[17644]: I0319 11:59:32.615663 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-config\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 
11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616185 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00dd3703-af25-4e71-b20b-b3e153383489-catalog-content\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616381 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-config\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616479 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616601 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/376b18a9-5f33-44fd-a37b-20ab02c5e65d-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616661 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-audit\") pod 
\"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616684 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616774 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-utilities\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616906 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-cni-binary-copy\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.616965 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6611e325-6152-480c-9c2c-1b503e49ccd2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617143 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617154 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617157 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-config\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617173 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163d6a3d-0080-4122-bb7a-17f6e63f00f0-metrics-tls\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617302 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/376b18a9-5f33-44fd-a37b-20ab02c5e65d-cache\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 
11:59:32.617347 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-serving-ca\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617375 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-config\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617234 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617550 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617560 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-etcd-ca\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: 
\"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617567 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617608 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ceeece-bee9-4fcb-8517-95ebce38e223-serving-cert\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617679 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e48b5aa9-293e-4222-91ff-7640addeca4c-etcd-client\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617711 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-stats-auth\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617769 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-catalog-content\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617788 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-script-lib\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617809 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-node-pullsecrets\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617844 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617902 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.617909 17644 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/034cad93-a500-4c58-8d97-fa49866a0d5e-snapshots\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618071 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618085 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/716c2176-50f9-4c4f-af0e-4c7973457df2-srv-cert\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618138 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a22c25-6073-4b1a-a029-928452ef37db-multus-daemon-config\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618174 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcbba74-ac53-4724-a217-4d9b85e7c1db-config\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:59:32.618406 master-0 
kubenswrapper[17644]: I0319 11:59:32.618178 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163d6a3d-0080-4122-bb7a-17f6e63f00f0-trusted-ca\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618208 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-catalog-content\") pod \"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618235 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618279 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618354 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-ovnkube-config\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.618406 master-0 kubenswrapper[17644]: I0319 11:59:32.618369 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3de8a1b-a5be-414f-86e8-738e16c8bc97-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.618534 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqcvx\" (UniqueName: \"kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.618633 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-dir\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.618768 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzdh\" (UniqueName: \"kubernetes.io/projected/d625c81e-01cc-424a-997d-546a5204a72b-kube-api-access-tgzdh\") pod \"csi-snapshot-controller-64854d9cff-764k4\" (UID: \"d625c81e-01cc-424a-997d-546a5204a72b\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.618887 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.618966 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.618995 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.619050 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dbmq\" (UniqueName: \"kubernetes.io/projected/52bdf7cc-f07d-487e-937c-6567f194947e-kube-api-access-8dbmq\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.619078 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-cabundle\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: 
\"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.619193 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-serving-ca\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.619202 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e48b5aa9-293e-4222-91ff-7640addeca4c-trusted-ca-bundle\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.619227 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.619358 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:59:32.619504 master-0 kubenswrapper[17644]: I0319 11:59:32.619521 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-signing-cabundle\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:59:32.635508 master-0 kubenswrapper[17644]: I0319 11:59:32.635154 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 11:59:32.635508 master-0 kubenswrapper[17644]: I0319 11:59:32.635418 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3053504d-0734-4def-b639-0f5cc2178185-env-overrides\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.641761 master-0 kubenswrapper[17644]: I0319 11:59:32.641712 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daf4dbb6-5a0a-4c92-a930-479a7330ace1-env-overrides\") pod \"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:59:32.656011 master-0 kubenswrapper[17644]: I0319 11:59:32.655328 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 11:59:32.677588 master-0 kubenswrapper[17644]: I0319 11:59:32.677432 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 11:59:32.696756 master-0 kubenswrapper[17644]: I0319 11:59:32.694079 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 11:59:32.702340 master-0 kubenswrapper[17644]: I0319 11:59:32.702289 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/39d3ac31-9259-454b-8e1c-e23024f8f2b2-config\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:59:32.717796 master-0 kubenswrapper[17644]: I0319 11:59:32.715842 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720152 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720202 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-host\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720222 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/24f71770-714e-4111-9188-ad8663c6baa7-rootfs\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720246 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod 
\"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720301 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-dir\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720319 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-node-pullsecrets\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720340 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720394 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720492 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720517 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720555 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720597 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-sys\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720616 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720640 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720687 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-modprobe-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720703 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720721 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720761 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-conf\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 
11:59:32.720795 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720816 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-sys\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.720809 master-0 kubenswrapper[17644]: I0319 11:59:32.720836 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysconfig\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.720925 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.720968 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721041 17644 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721057 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-run\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721079 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721157 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721174 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c898657-f06b-44ab-95ff-53a324759ba1-hosts-file\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721208 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721237 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721262 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721289 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721313 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-var-lib-kubelet\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721343 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721410 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721426 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-root\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721457 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721481 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-lib-modules\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.721787 master-0 
kubenswrapper[17644]: I0319 11:59:32.721513 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721530 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721566 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721582 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721666 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721685 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721703 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721737 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-kubernetes\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721757 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.721787 master-0 kubenswrapper[17644]: I0319 11:59:32.721783 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: 
\"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.721906 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-multus\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.721949 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-netns\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.721972 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-sys\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722013 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-systemd-units\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722055 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-system-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" 
Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722054 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722104 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722109 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-var-lib-kubelet\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722110 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722175 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-multus-certs\") 
pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722133 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b94d1eb-1b80-4a14-b1c0-d9e192231352-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722179 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-netd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722016 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/376b18a9-5f33-44fd-a37b-20ab02c5e65d-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722310 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-modprobe-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722317 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-lib-modules\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722329 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-kubernetes\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722350 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-kubelet\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722354 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-etc-kubernetes\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722382 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-cni-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722386 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-host-etc-kube\") pod 
\"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722416 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-conf-dir\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722439 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-systemd\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722446 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-cnibin\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722471 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-host\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722481 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysconfig\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722493 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-root\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722520 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-system-cni-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722521 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-cnibin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722558 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-conf\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722560 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") 
pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722579 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-os-release\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722606 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-dir\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722606 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722625 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-kubelet\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722649 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-node-pullsecrets\") 
pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722655 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-var-lib-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722665 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-sys\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722667 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-cni-bin\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722680 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722695 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722704 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c898657-f06b-44ab-95ff-53a324759ba1-hosts-file\") pod \"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722700 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-etc-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722722 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-var-lib-cni-bin\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.722742 master-0 kubenswrapper[17644]: I0319 11:59:32.722751 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-k8s-cni-cncf-io\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.722845 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-run\") pod \"tuned-x6mmm\" 
(UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.722871 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/24f71770-714e-4111-9188-ad8663c6baa7-rootfs\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.722894 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-ovn\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.722927 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3b0d24-ce5e-49c3-a546-874356f75dc6-host-etc-kube\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.722949 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-slash\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723002 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-d\") pod \"tuned-x6mmm\" (UID: 
\"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723035 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-wtmp\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723062 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723079 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723122 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723180 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-audit-dir\") pod \"apiserver-f67f6868b-chx8j\" (UID: 
\"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723247 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723263 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723280 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723310 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723359 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: 
\"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723398 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-systemd\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723446 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723496 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723623 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-multus-socket-dir-parent\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723647 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-host-run-netns\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " 
pod="openshift-multus/multus-552pc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723677 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-sysctl-d\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723705 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-wtmp\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723747 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-host-slash\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723768 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-run-openvswitch\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723789 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-log-socket\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 
11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723811 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e48b5aa9-293e-4222-91ff-7640addeca4c-audit-dir\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723833 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723853 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-hostroot\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723884 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dd6ec279-d92f-45c2-97c2-88b96fbd6600-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723903 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 
11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723903 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8376e1f9-ab05-42d4-aa66-284a167a9bfc-etc-systemd\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723933 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3053504d-0734-4def-b639-0f5cc2178185-node-log\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.723958 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a22c25-6073-4b1a-a029-928452ef37db-os-release\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:32.724455 master-0 kubenswrapper[17644]: I0319 11:59:32.724002 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:32.735676 master-0 kubenswrapper[17644]: I0319 11:59:32.734925 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 11:59:32.742362 master-0 kubenswrapper[17644]: I0319 11:59:32.742312 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-telemetry-config\") pod 
\"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:59:32.755491 master-0 kubenswrapper[17644]: I0319 11:59:32.755409 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 11:59:32.773261 master-0 kubenswrapper[17644]: I0319 11:59:32.773205 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 11:59:32.779073 master-0 kubenswrapper[17644]: I0319 11:59:32.779016 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-env-overrides\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:59:32.794012 master-0 kubenswrapper[17644]: I0319 11:59:32.793957 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 11:59:32.796660 master-0 kubenswrapper[17644]: I0319 11:59:32.796620 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/8438d015-106b-4aed-ae12-dda781ce51fc-ovnkube-identity-cm\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:59:32.814290 master-0 kubenswrapper[17644]: I0319 11:59:32.814226 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 11:59:32.833561 master-0 kubenswrapper[17644]: I0319 11:59:32.833500 17644 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 11:59:32.854135 master-0 kubenswrapper[17644]: I0319 11:59:32.854078 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 11:59:32.856704 master-0 kubenswrapper[17644]: I0319 11:59:32.856660 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-client\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.875184 master-0 kubenswrapper[17644]: I0319 11:59:32.875127 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 11:59:32.885317 master-0 kubenswrapper[17644]: I0319 11:59:32.885256 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-iptables-alerter-script\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:59:32.894950 master-0 kubenswrapper[17644]: I0319 11:59:32.894870 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 11:59:32.906113 master-0 kubenswrapper[17644]: I0319 11:59:32.906045 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-metrics-tls\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:59:32.918785 master-0 kubenswrapper[17644]: I0319 11:59:32.918737 17644 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 11:59:32.929312 master-0 kubenswrapper[17644]: I0319 11:59:32.928362 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-serving-cert\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:32.939413 master-0 kubenswrapper[17644]: I0319 11:59:32.939347 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 11:59:32.946651 master-0 kubenswrapper[17644]: I0319 11:59:32.946587 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-metrics-certs\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:32.956273 master-0 kubenswrapper[17644]: I0319 11:59:32.956213 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 11:59:32.965368 master-0 kubenswrapper[17644]: I0319 11:59:32.965311 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-default-certificate\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:32.974645 master-0 kubenswrapper[17644]: I0319 11:59:32.974570 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 11:59:32.978901 master-0 kubenswrapper[17644]: I0319 11:59:32.978839 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/e2ad29ad-70ef-43c6-91f6-02f04d145673-stats-auth\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:32.994503 master-0 kubenswrapper[17644]: I0319 11:59:32.994434 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 11:59:32.996689 master-0 kubenswrapper[17644]: I0319 11:59:32.996641 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e45616db-f7dd-4a08-847f-abf2759d9fa4-encryption-config\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:33.015089 master-0 kubenswrapper[17644]: I0319 11:59:33.015034 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 11:59:33.034771 master-0 kubenswrapper[17644]: I0319 11:59:33.034692 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 11:59:33.043717 master-0 kubenswrapper[17644]: I0319 11:59:33.043655 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6ec279-d92f-45c2-97c2-88b96fbd6600-serving-cert\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:33.055423 master-0 kubenswrapper[17644]: I0319 11:59:33.055370 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 11:59:33.063497 master-0 kubenswrapper[17644]: I0319 11:59:33.063448 17644 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:59:33.073566 master-0 kubenswrapper[17644]: I0319 11:59:33.073337 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-7gswr" Mar 19 11:59:33.094672 master-0 kubenswrapper[17644]: I0319 11:59:33.094597 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-sx7wj" Mar 19 11:59:33.114439 master-0 kubenswrapper[17644]: I0319 11:59:33.114391 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 11:59:33.117266 master-0 kubenswrapper[17644]: I0319 11:59:33.117204 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9778f8f5-b0d1-4967-9776-9db758bba3af-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-89rdt\" (UID: \"9778f8f5-b0d1-4967-9776-9db758bba3af\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt" Mar 19 11:59:33.133798 master-0 kubenswrapper[17644]: I0319 11:59:33.133718 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 11:59:33.134444 master-0 kubenswrapper[17644]: I0319 11:59:33.134408 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert\") pod 
\"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:33.154540 master-0 kubenswrapper[17644]: I0319 11:59:33.154494 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 11:59:33.157899 master-0 kubenswrapper[17644]: I0319 11:59:33.157869 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:59:33.175738 master-0 kubenswrapper[17644]: I0319 11:59:33.175660 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-6nq75" Mar 19 11:59:33.179140 master-0 kubenswrapper[17644]: I0319 11:59:33.179085 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-check-endpoints/0.log" Mar 19 11:59:33.181212 master-0 kubenswrapper[17644]: I0319 11:59:33.181112 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:33.192433 master-0 kubenswrapper[17644]: I0319 11:59:33.192388 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:59:33.193127 master-0 kubenswrapper[17644]: I0319 11:59:33.193100 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-qsmbf"
Mar 19 11:59:33.216609 master-0 kubenswrapper[17644]: I0319 11:59:33.216544 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 11:59:33.218278 master-0 kubenswrapper[17644]: I0319 11:59:33.218248 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-trusted-ca-bundle\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r"
Mar 19 11:59:33.234979 master-0 kubenswrapper[17644]: I0319 11:59:33.234931 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 11:59:33.243603 master-0 kubenswrapper[17644]: I0319 11:59:33.243504 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-audit-policies\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r"
Mar 19 11:59:33.254456 master-0 kubenswrapper[17644]: I0319 11:59:33.254393 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 11:59:33.260559 master-0 kubenswrapper[17644]: I0319 11:59:33.260487 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e45616db-f7dd-4a08-847f-abf2759d9fa4-etcd-serving-ca\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r"
Mar 19 11:59:33.273909 master-0 kubenswrapper[17644]: I0319 11:59:33.273854 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 11:59:33.293662 master-0 kubenswrapper[17644]: I0319 11:59:33.293578 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 11:59:33.313867 master-0 kubenswrapper[17644]: I0319 11:59:33.313795 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-6zbld"
Mar 19 11:59:33.333318 master-0 kubenswrapper[17644]: I0319 11:59:33.333252 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 11:59:33.337067 master-0 kubenswrapper[17644]: I0319 11:59:33.337034 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") pod \"1c576a88-6da4-43e9-a373-0df27a029f59\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") "
Mar 19 11:59:33.337221 master-0 kubenswrapper[17644]: I0319 11:59:33.337201 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") pod \"1c576a88-6da4-43e9-a373-0df27a029f59\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") "
Mar 19 11:59:33.337332 master-0 kubenswrapper[17644]: I0319 11:59:33.337253 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c576a88-6da4-43e9-a373-0df27a029f59" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:59:33.337381 master-0 kubenswrapper[17644]: I0319 11:59:33.337284 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock" (OuterVolumeSpecName: "var-lock") pod "1c576a88-6da4-43e9-a373-0df27a029f59" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:59:33.338540 master-0 kubenswrapper[17644]: I0319 11:59:33.338490 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 11:59:33.338662 master-0 kubenswrapper[17644]: I0319 11:59:33.338521 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c576a88-6da4-43e9-a373-0df27a029f59-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:59:33.360669 master-0 kubenswrapper[17644]: I0319 11:59:33.360615 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 11:59:33.365996 master-0 kubenswrapper[17644]: I0319 11:59:33.365937 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm"
Mar 19 11:59:33.374364 master-0 kubenswrapper[17644]: I0319 11:59:33.374301 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-6gp54"
Mar 19 11:59:33.394652 master-0 kubenswrapper[17644]: I0319 11:59:33.394597 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 11:59:33.417327 master-0 kubenswrapper[17644]: I0319 11:59:33.417249 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 11:59:33.439697 master-0 kubenswrapper[17644]: I0319 11:59:33.439649 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-cjb2h"
Mar 19 11:59:33.454217 master-0 kubenswrapper[17644]: I0319 11:59:33.454172 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 11:59:33.473764 master-0 kubenswrapper[17644]: I0319 11:59:33.473693 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 19 11:59:33.492033 master-0 kubenswrapper[17644]: I0319 11:59:33.491969 17644 request.go:700] Waited for 1.004387479s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-catalogd/configmaps?fieldSelector=metadata.name%3Dcatalogd-trusted-ca-bundle&limit=500&resourceVersion=0
Mar 19 11:59:33.511136 master-0 kubenswrapper[17644]: I0319 11:59:33.511054 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 19 11:59:33.515588 master-0 kubenswrapper[17644]: I0319 11:59:33.515543 17644 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:59:33.515652 master-0 kubenswrapper[17644]: I0319 11:59:33.515609 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 19 11:59:33.518206 master-0 kubenswrapper[17644]: I0319 11:59:33.518173 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"
Mar 19 11:59:33.518338 master-0 kubenswrapper[17644]: I0319 11:59:33.518292 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:59:33.518373 master-0 kubenswrapper[17644]: I0319 11:59:33.518351 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:59:33.518373 master-0 kubenswrapper[17644]: I0319 11:59:33.518363 17644 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:59:33.518700 master-0 kubenswrapper[17644]: I0319 11:59:33.518676 17644 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:59:33.539901 master-0 kubenswrapper[17644]: I0319 11:59:33.538208 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 11:59:33.543633 master-0 kubenswrapper[17644]: I0319 11:59:33.543581 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-config-volume\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs"
Mar 19 11:59:33.555259 master-0 kubenswrapper[17644]: I0319 11:59:33.555215 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 11:59:33.575450 master-0 kubenswrapper[17644]: I0319 11:59:33.575410 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 11:59:33.585425 master-0 kubenswrapper[17644]: I0319 11:59:33.585373 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2ad29ad-70ef-43c6-91f6-02f04d145673-service-ca-bundle\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt"
Mar 19 11:59:33.593439 master-0 kubenswrapper[17644]: I0319 11:59:33.593409 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 11:59:33.602565 master-0 kubenswrapper[17644]: E0319 11:59:33.602513 17644 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.602900 master-0 kubenswrapper[17644]: E0319 11:59:33.602624 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca podName:e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.102601983 +0000 UTC m=+7.872560018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca") pod "route-controller-manager-864f875b6b-rcjvd" (UID: "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.602900 master-0 kubenswrapper[17644]: E0319 11:59:33.602520 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.602900 master-0 kubenswrapper[17644]: E0319 11:59:33.602654 17644 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.602900 master-0 kubenswrapper[17644]: E0319 11:59:33.602657 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca podName:d06b230b-db67-4afc-8d10-2c33ad568462 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.102650344 +0000 UTC m=+7.872608379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca") pod "node-exporter-pnb9m" (UID: "d06b230b-db67-4afc-8d10-2c33ad568462") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.602900 master-0 kubenswrapper[17644]: E0319 11:59:33.602821 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert podName:034cad93-a500-4c58-8d97-fa49866a0d5e nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.102762576 +0000 UTC m=+7.872720611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert") pod "insights-operator-68bf6ff9d6-djfg8" (UID: "034cad93-a500-4c58-8d97-fa49866a0d5e") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.602900 master-0 kubenswrapper[17644]: E0319 11:59:33.602846 17644 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.602900 master-0 kubenswrapper[17644]: E0319 11:59:33.602904 17644 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.603172 master-0 kubenswrapper[17644]: E0319 11:59:33.602925 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert podName:ac09dba7-398c-4b0a-a415-edb73cb4cf30 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.10291198 +0000 UTC m=+7.872870015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert") pod "cluster-autoscaler-operator-866dc4744-dnx7f" (UID: "ac09dba7-398c-4b0a-a415-edb73cb4cf30") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.603172 master-0 kubenswrapper[17644]: E0319 11:59:33.602944 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls podName:bb22a965-9b36-40cd-993d-747a3978be8e nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.102935521 +0000 UTC m=+7.872893556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls") pod "cluster-samples-operator-85f7577d78-ssxxd" (UID: "bb22a965-9b36-40cd-993d-747a3978be8e") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.603172 master-0 kubenswrapper[17644]: E0319 11:59:33.602996 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.603172 master-0 kubenswrapper[17644]: E0319 11:59:33.603055 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca podName:dedf55c4-eeda-4955-aafe-db1fdb8c4a48 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.103043563 +0000 UTC m=+7.873001688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca") pod "openshift-state-metrics-5dc6c74576-lwqmn" (UID: "dedf55c4-eeda-4955-aafe-db1fdb8c4a48") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.603435 master-0 kubenswrapper[17644]: E0319 11:59:33.603397 17644 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.603643 master-0 kubenswrapper[17644]: E0319 11:59:33.603628 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images podName:b5c7eb66-e23e-40df-883c-fed012c07f26 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.103593196 +0000 UTC m=+7.873551231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images") pod "machine-config-operator-84d549f6d5-66wvv" (UID: "b5c7eb66-e23e-40df-883c-fed012c07f26") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.603753 master-0 kubenswrapper[17644]: E0319 11:59:33.603635 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.603872 master-0 kubenswrapper[17644]: E0319 11:59:33.603644 17644 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.603998 master-0 kubenswrapper[17644]: E0319 11:59:33.603981 17644 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.604113 master-0 kubenswrapper[17644]: E0319 11:59:33.603664 17644 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604235 master-0 kubenswrapper[17644]: E0319 11:59:33.604207 17644 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.604288 master-0 kubenswrapper[17644]: E0319 11:59:33.603659 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.604321 master-0 kubenswrapper[17644]: E0319 11:59:33.603746 17644 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604321 master-0 kubenswrapper[17644]: E0319 11:59:33.603771 17644 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604375 master-0 kubenswrapper[17644]: E0319 11:59:33.603852 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca podName:2d63d5a8-f45d-4678-824d-5534b2bcd6ca nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.103838933 +0000 UTC m=+7.873796968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca") pod "kube-state-metrics-7bbc969446-xkg9f" (UID: "2d63d5a8-f45d-4678-824d-5534b2bcd6ca") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.604375 master-0 kubenswrapper[17644]: E0319 11:59:33.604373 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config podName:d06b230b-db67-4afc-8d10-2c33ad568462 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.104361085 +0000 UTC m=+7.874319320 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config") pod "node-exporter-pnb9m" (UID: "d06b230b-db67-4afc-8d10-2c33ad568462") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604449 master-0 kubenswrapper[17644]: E0319 11:59:33.604039 17644 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604449 master-0 kubenswrapper[17644]: E0319 11:59:33.604403 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle podName:034cad93-a500-4c58-8d97-fa49866a0d5e nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.104391646 +0000 UTC m=+7.874349871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle") pod "insights-operator-68bf6ff9d6-djfg8" (UID: "034cad93-a500-4c58-8d97-fa49866a0d5e") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.604449 master-0 kubenswrapper[17644]: E0319 11:59:33.604431 17644 secret.go:189] Couldn't get secret openshift-cluster-storage-operator/cluster-storage-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604552 master-0 kubenswrapper[17644]: E0319 11:59:33.604455 17644 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604552 master-0 kubenswrapper[17644]: E0319 11:59:33.604431 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config podName:2d63d5a8-f45d-4678-824d-5534b2bcd6ca nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.104420256 +0000 UTC m=+7.874378531 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-7bbc969446-xkg9f" (UID: "2d63d5a8-f45d-4678-824d-5534b2bcd6ca") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604552 master-0 kubenswrapper[17644]: E0319 11:59:33.604505 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config podName:12809811-c9df-4e77-8c12-309831b8975d nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.104493208 +0000 UTC m=+7.874451243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config") pod "machine-config-controller-b4f87c5b9-lg6h9" (UID: "12809811-c9df-4e77-8c12-309831b8975d") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.604552 master-0 kubenswrapper[17644]: E0319 11:59:33.604525 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap podName:2d63d5a8-f45d-4678-824d-5534b2bcd6ca nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.104516249 +0000 UTC m=+7.874474284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-7bbc969446-xkg9f" (UID: "2d63d5a8-f45d-4678-824d-5534b2bcd6ca") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.604552 master-0 kubenswrapper[17644]: E0319 11:59:33.604548 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls podName:b5c7eb66-e23e-40df-883c-fed012c07f26 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.104538799 +0000 UTC m=+7.874496834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls") pod "machine-config-operator-84d549f6d5-66wvv" (UID: "b5c7eb66-e23e-40df-883c-fed012c07f26") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604710 master-0 kubenswrapper[17644]: E0319 11:59:33.604563 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config podName:f4aad0ff-e6cd-4c43-9561-80a14fee4712 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.10455657 +0000 UTC m=+7.874514605 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-6c8df6d4b-xfwkr" (UID: "f4aad0ff-e6cd-4c43-9561-80a14fee4712") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604710 master-0 kubenswrapper[17644]: E0319 11:59:33.604585 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.10457609 +0000 UTC m=+7.874534115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604710 master-0 kubenswrapper[17644]: E0319 11:59:33.604605 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert podName:52bdf7cc-f07d-487e-937c-6567f194947e nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.104601081 +0000 UTC m=+7.874559116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-storage-operator-serving-cert" (UniqueName: "kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert") pod "cluster-storage-operator-7d87854d6-htdhf" (UID: "52bdf7cc-f07d-487e-937c-6567f194947e") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604710 master-0 kubenswrapper[17644]: E0319 11:59:33.604620 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls podName:d06b230b-db67-4afc-8d10-2c33ad568462 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.104613371 +0000 UTC m=+7.874571396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls") pod "node-exporter-pnb9m" (UID: "d06b230b-db67-4afc-8d10-2c33ad568462") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604902 master-0 kubenswrapper[17644]: E0319 11:59:33.604887 17644 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.604996 master-0 kubenswrapper[17644]: E0319 11:59:33.604985 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.10497522 +0000 UTC m=+7.874933255 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.611970 master-0 kubenswrapper[17644]: E0319 11:59:33.611934 17644 configmap.go:193] Couldn't get configMap openshift-machine-api/baremetal-kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.611970 master-0 kubenswrapper[17644]: E0319 11:59:33.612005 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config podName:92e401a4-ed2f-46f7-924b-329d7b313e6a nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.11198844 +0000 UTC m=+7.881946685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config") pod "cluster-baremetal-operator-6f69995874-942g6" (UID: "92e401a4-ed2f-46f7-924b-329d7b313e6a") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.612602 master-0 kubenswrapper[17644]: E0319 11:59:33.612576 17644 configmap.go:193] Couldn't get configMap openshift-machine-api/cluster-baremetal-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.612769 master-0 kubenswrapper[17644]: E0319 11:59:33.612719 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images podName:92e401a4-ed2f-46f7-924b-329d7b313e6a nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.112700788 +0000 UTC m=+7.882658823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images") pod "cluster-baremetal-operator-6f69995874-942g6" (UID: "92e401a4-ed2f-46f7-924b-329d7b313e6a") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.613365 master-0 kubenswrapper[17644]: E0319 11:59:33.613327 17644 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.613427 master-0 kubenswrapper[17644]: E0319 11:59:33.613404 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config podName:e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.113388224 +0000 UTC m=+7.883346249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config") pod "route-controller-manager-864f875b6b-rcjvd" (UID: "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.613602 master-0 kubenswrapper[17644]: E0319 11:59:33.613569 17644 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.613602 master-0 kubenswrapper[17644]: E0319 11:59:33.613584 17644 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.613701 master-0 kubenswrapper[17644]: E0319 11:59:33.613622 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert podName:92e401a4-ed2f-46f7-924b-329d7b313e6a nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.11361147 +0000 UTC m=+7.883569715 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert") pod "cluster-baremetal-operator-6f69995874-942g6" (UID: "92e401a4-ed2f-46f7-924b-329d7b313e6a") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.613701 master-0 kubenswrapper[17644]: E0319 11:59:33.613642 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls podName:dedf55c4-eeda-4955-aafe-db1fdb8c4a48 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.113633581 +0000 UTC m=+7.883591616 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls") pod "openshift-state-metrics-5dc6c74576-lwqmn" (UID: "dedf55c4-eeda-4955-aafe-db1fdb8c4a48") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.613836 master-0 kubenswrapper[17644]: E0319 11:59:33.613719 17644 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.613836 master-0 kubenswrapper[17644]: E0319 11:59:33.613773 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls podName:12809811-c9df-4e77-8c12-309831b8975d nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.113765484 +0000 UTC m=+7.883723519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls") pod "machine-config-controller-b4f87c5b9-lg6h9" (UID: "12809811-c9df-4e77-8c12-309831b8975d") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.614051 master-0 kubenswrapper[17644]: E0319 11:59:33.614031 17644 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.614188 master-0 kubenswrapper[17644]: E0319 11:59:33.614162 17644 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.614236 master-0 kubenswrapper[17644]: E0319 11:59:33.614159 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls podName:c8d8a09f-22d5-4f16-84d6-d5f2c504c949 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.114144803 +0000 UTC m=+7.884102828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7dff898856-87z86" (UID: "c8d8a09f-22d5-4f16-84d6-d5f2c504c949") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.614236 master-0 kubenswrapper[17644]: E0319 11:59:33.614221 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca podName:76cf2b01-33d9-47eb-be5d-44946c78bf20 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.114212924 +0000 UTC m=+7.884170959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca") pod "controller-manager-548bb99f44-txbjj" (UID: "76cf2b01-33d9-47eb-be5d-44946c78bf20") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.614236 master-0 kubenswrapper[17644]: E0319 11:59:33.614073 17644 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.614328 master-0 kubenswrapper[17644]: E0319 11:59:33.614250 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert podName:6d41245b-33d4-40f8-bbe1-6d2247e2e335 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.114243895 +0000 UTC m=+7.884201930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert") pod "packageserver-bbf67c86c-n58nq" (UID: "6d41245b-33d4-40f8-bbe1-6d2247e2e335") : failed to sync secret cache: timed out waiting for the condition
Mar 19 11:59:33.614328 master-0 kubenswrapper[17644]: I0319 11:59:33.614057 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 11:59:33.614451 master-0 kubenswrapper[17644]: E0319 11:59:33.614416 17644 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.614524 master-0 kubenswrapper[17644]: E0319 11:59:33.614497 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles podName:76cf2b01-33d9-47eb-be5d-44946c78bf20 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.114481081 +0000 UTC m=+7.884439326 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles") pod "controller-manager-548bb99f44-txbjj" (UID: "76cf2b01-33d9-47eb-be5d-44946c78bf20") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.614599 master-0 kubenswrapper[17644]: E0319 11:59:33.614432 17644 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.614785 master-0 kubenswrapper[17644]: E0319 11:59:33.614709 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images podName:75aedbcd-f6ed-43a1-941b-4b04887ffe8e nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.114696566 +0000 UTC m=+7.884654851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images") pod "machine-api-operator-6fbb6cf6f9-jf7p6" (UID: "75aedbcd-f6ed-43a1-941b-4b04887ffe8e") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.614876 master-0 kubenswrapper[17644]: E0319 11:59:33.614851 17644 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 11:59:33.614940 master-0 kubenswrapper[17644]: E0319 11:59:33.614901 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config podName:75aedbcd-f6ed-43a1-941b-4b04887ffe8e nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.114891741 +0000 UTC m=+7.884849776 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config") pod "machine-api-operator-6fbb6cf6f9-jf7p6" (UID: "75aedbcd-f6ed-43a1-941b-4b04887ffe8e") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.615002 master-0 kubenswrapper[17644]: E0319 11:59:33.614817 17644 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.615130 master-0 kubenswrapper[17644]: E0319 11:59:33.615116 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config podName:76cf2b01-33d9-47eb-be5d-44946c78bf20 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.115100786 +0000 UTC m=+7.885059041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config") pod "controller-manager-548bb99f44-txbjj" (UID: "76cf2b01-33d9-47eb-be5d-44946c78bf20") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.618089 master-0 kubenswrapper[17644]: E0319 11:59:33.617878 17644 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.618089 master-0 kubenswrapper[17644]: E0319 11:59:33.617936 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca podName:0cbbe8d0-aafb-499f-a1f4-affcea62c1ab nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.117925655 +0000 UTC m=+7.887883690 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca") pod "cloud-credential-operator-744f9dbf77-mhvls" (UID: "0cbbe8d0-aafb-499f-a1f4-affcea62c1ab") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.618089 master-0 kubenswrapper[17644]: E0319 11:59:33.617971 17644 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.618089 master-0 kubenswrapper[17644]: E0319 11:59:33.618003 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls podName:24f71770-714e-4111-9188-ad8663c6baa7 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.117997326 +0000 UTC m=+7.887955351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls") pod "machine-config-daemon-mgzld" (UID: "24f71770-714e-4111-9188-ad8663c6baa7") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619221 master-0 kubenswrapper[17644]: E0319 11:59:33.619097 17644 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619221 master-0 kubenswrapper[17644]: E0319 11:59:33.619152 17644 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.619221 master-0 kubenswrapper[17644]: E0319 11:59:33.619118 17644 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619221 master-0 
kubenswrapper[17644]: E0319 11:59:33.619229 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config podName:a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119206706 +0000 UTC m=+7.889164861 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config") pod "machine-approver-5c6485487f-5zvc5" (UID: "a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619258 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert podName:0cbbe8d0-aafb-499f-a1f4-affcea62c1ab nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119240417 +0000 UTC m=+7.889198702 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-744f9dbf77-mhvls" (UID: "0cbbe8d0-aafb-499f-a1f4-affcea62c1ab") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619276 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config podName:dedf55c4-eeda-4955-aafe-db1fdb8c4a48 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119267967 +0000 UTC m=+7.889226252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-5dc6c74576-lwqmn" (UID: "dedf55c4-eeda-4955-aafe-db1fdb8c4a48") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619294 17644 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619327 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert podName:6d41245b-33d4-40f8-bbe1-6d2247e2e335 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119318029 +0000 UTC m=+7.889276304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert") pod "packageserver-bbf67c86c-n58nq" (UID: "6d41245b-33d4-40f8-bbe1-6d2247e2e335") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619360 17644 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619388 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config podName:a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.11937938 +0000 UTC m=+7.889337665 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config") pod "machine-approver-5c6485487f-5zvc5" (UID: "a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619393 17644 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619417 17644 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619434 17644 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619469 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls podName:75aedbcd-f6ed-43a1-941b-4b04887ffe8e nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119452642 +0000 UTC m=+7.889410897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls") pod "machine-api-operator-6fbb6cf6f9-jf7p6" (UID: "75aedbcd-f6ed-43a1-941b-4b04887ffe8e") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619491 17644 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619499 17644 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619511 17644 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619580 17644 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.619582 master-0 kubenswrapper[17644]: E0319 11:59:33.619596 17644 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619610 17644 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619497 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config 
podName:b5c7eb66-e23e-40df-883c-fed012c07f26 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119486603 +0000 UTC m=+7.889444888 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config") pod "machine-config-operator-84d549f6d5-66wvv" (UID: "b5c7eb66-e23e-40df-883c-fed012c07f26") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619648 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619661 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs podName:6870ccc7-2094-48d8-9238-f486a4b8d5af nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119649696 +0000 UTC m=+7.889607981 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs") pod "machine-config-server-ltk8s" (UID: "6870ccc7-2094-48d8-9238-f486a4b8d5af") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619582 17644 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619669 17644 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619688 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls podName:f4aad0ff-e6cd-4c43-9561-80a14fee4712 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119670987 +0000 UTC m=+7.889629022 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls") pod "prometheus-operator-6c8df6d4b-xfwkr" (UID: "f4aad0ff-e6cd-4c43-9561-80a14fee4712") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619522 17644 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619706 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls podName:a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119698268 +0000 UTC m=+7.889656583 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls") pod "machine-approver-5c6485487f-5zvc5" (UID: "a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619756 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images podName:c8d8a09f-22d5-4f16-84d6-d5f2c504c949 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119747849 +0000 UTC m=+7.889705894 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images") pod "cluster-cloud-controller-manager-operator-7dff898856-87z86" (UID: "c8d8a09f-22d5-4f16-84d6-d5f2c504c949") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619564 17644 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-6ro5itlgu7nag: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619788 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config podName:24f71770-714e-4111-9188-ad8663c6baa7 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119780269 +0000 UTC m=+7.889738555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config") pod "machine-config-daemon-mgzld" (UID: "24f71770-714e-4111-9188-ad8663c6baa7") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619806 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token podName:6870ccc7-2094-48d8-9238-f486a4b8d5af nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.11979781 +0000 UTC m=+7.889756105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token") pod "machine-config-server-ltk8s" (UID: "6870ccc7-2094-48d8-9238-f486a4b8d5af") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619822 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls podName:2d63d5a8-f45d-4678-824d-5534b2bcd6ca nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.11981488 +0000 UTC m=+7.889773175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls") pod "kube-state-metrics-7bbc969446-xkg9f" (UID: "2d63d5a8-f45d-4678-824d-5534b2bcd6ca") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619844 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls podName:92e401a4-ed2f-46f7-924b-329d7b313e6a nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119831231 +0000 UTC m=+7.889789496 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-942g6" (UID: "92e401a4-ed2f-46f7-924b-329d7b313e6a") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619621 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619862 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119854331 +0000 UTC m=+7.889812626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619892 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config podName:c8d8a09f-22d5-4f16-84d6-d5f2c504c949 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119883102 +0000 UTC m=+7.889841397 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7dff898856-87z86" (UID: "c8d8a09f-22d5-4f16-84d6-d5f2c504c949") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619690 17644 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619914 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config podName:ac09dba7-398c-4b0a-a415-edb73cb4cf30 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119903702 +0000 UTC m=+7.889861997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config") pod "cluster-autoscaler-operator-866dc4744-dnx7f" (UID: "ac09dba7-398c-4b0a-a415-edb73cb4cf30") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619942 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119935063 +0000 UTC m=+7.889893358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619964 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca podName:f4aad0ff-e6cd-4c43-9561-80a14fee4712 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119953304 +0000 UTC m=+7.889911599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca") pod "prometheus-operator-6c8df6d4b-xfwkr" (UID: "f4aad0ff-e6cd-4c43-9561-80a14fee4712") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: I0319 11:59:33.619952 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd6ec279-d92f-45c2-97c2-88b96fbd6600-service-ca\") pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.619991 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle podName:034cad93-a500-4c58-8d97-fa49866a0d5e nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.119983074 +0000 UTC m=+7.889941379 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle") pod "insights-operator-68bf6ff9d6-djfg8" (UID: "034cad93-a500-4c58-8d97-fa49866a0d5e") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.620887 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.621754 master-0 kubenswrapper[17644]: E0319 11:59:33.620963 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:34.120948218 +0000 UTC m=+7.890906473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:33.633708 master-0 kubenswrapper[17644]: I0319 11:59:33.633654 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 11:59:33.654107 master-0 kubenswrapper[17644]: I0319 11:59:33.654042 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-dfb6p" Mar 19 11:59:33.674544 master-0 kubenswrapper[17644]: I0319 11:59:33.674490 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 11:59:33.693514 master-0 kubenswrapper[17644]: I0319 11:59:33.693374 17644 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 19 11:59:33.713085 master-0 kubenswrapper[17644]: I0319 11:59:33.713006 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 19 11:59:33.735309 master-0 kubenswrapper[17644]: I0319 11:59:33.733862 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 19 11:59:33.754558 master-0 kubenswrapper[17644]: I0319 11:59:33.754467 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-8h56m" Mar 19 11:59:33.773508 master-0 kubenswrapper[17644]: I0319 11:59:33.773426 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 11:59:33.794078 master-0 kubenswrapper[17644]: I0319 11:59:33.793980 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 19 11:59:33.813712 master-0 kubenswrapper[17644]: I0319 11:59:33.813639 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 11:59:33.833827 master-0 kubenswrapper[17644]: I0319 11:59:33.833712 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 11:59:33.853326 master-0 kubenswrapper[17644]: I0319 11:59:33.853267 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 11:59:33.875313 master-0 kubenswrapper[17644]: I0319 11:59:33.875260 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 11:59:33.894509 master-0 kubenswrapper[17644]: I0319 11:59:33.894426 
17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 11:59:33.914327 master-0 kubenswrapper[17644]: I0319 11:59:33.914253 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 11:59:33.935153 master-0 kubenswrapper[17644]: I0319 11:59:33.935041 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 11:59:33.954415 master-0 kubenswrapper[17644]: I0319 11:59:33.954229 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 11:59:33.981211 master-0 kubenswrapper[17644]: I0319 11:59:33.981139 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 11:59:33.993293 master-0 kubenswrapper[17644]: I0319 11:59:33.993229 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 11:59:34.015263 master-0 kubenswrapper[17644]: I0319 11:59:34.015153 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 11:59:34.033467 master-0 kubenswrapper[17644]: I0319 11:59:34.033361 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 11:59:34.054568 master-0 kubenswrapper[17644]: I0319 11:59:34.054445 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 11:59:34.073684 master-0 kubenswrapper[17644]: I0319 11:59:34.073589 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-pdvk4" Mar 19 11:59:34.094607 master-0 
kubenswrapper[17644]: I0319 11:59:34.094513 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 19 11:59:34.115195 master-0 kubenswrapper[17644]: I0319 11:59:34.115108 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 11:59:34.134584 master-0 kubenswrapper[17644]: I0319 11:59:34.134524 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 11:59:34.154026 master-0 kubenswrapper[17644]: I0319 11:59:34.153937 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-g72px" Mar 19 11:59:34.159011 master-0 kubenswrapper[17644]: I0319 11:59:34.158938 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:34.159011 master-0 kubenswrapper[17644]: I0319 11:59:34.159012 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:59:34.159237 master-0 kubenswrapper[17644]: I0319 11:59:34.159042 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config\") pod \"controller-manager-548bb99f44-txbjj\" (UID: 
\"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:34.159237 master-0 kubenswrapper[17644]: I0319 11:59:34.159103 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:34.159237 master-0 kubenswrapper[17644]: I0319 11:59:34.159131 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:34.159237 master-0 kubenswrapper[17644]: I0319 11:59:34.159180 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:34.159420 master-0 kubenswrapper[17644]: I0319 11:59:34.159275 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:34.159420 master-0 kubenswrapper[17644]: I0319 11:59:34.159312 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:59:34.159420 master-0 kubenswrapper[17644]: I0319 11:59:34.159344 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:34.159420 master-0 kubenswrapper[17644]: I0319 11:59:34.159372 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:34.159420 master-0 kubenswrapper[17644]: I0319 11:59:34.159400 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:34.159597 master-0 kubenswrapper[17644]: I0319 11:59:34.159444 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " 
pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:34.159597 master-0 kubenswrapper[17644]: I0319 11:59:34.159476 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:59:34.159597 master-0 kubenswrapper[17644]: I0319 11:59:34.159516 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:34.159597 master-0 kubenswrapper[17644]: I0319 11:59:34.159548 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:59:34.159717 master-0 kubenswrapper[17644]: I0319 11:59:34.159601 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:34.159717 master-0 kubenswrapper[17644]: I0319 11:59:34.159630 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:34.159717 master-0 kubenswrapper[17644]: I0319 11:59:34.159664 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:34.159717 master-0 kubenswrapper[17644]: I0319 11:59:34.159684 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:34.159878 master-0 kubenswrapper[17644]: I0319 11:59:34.159758 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:59:34.159878 master-0 kubenswrapper[17644]: I0319 11:59:34.159844 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:34.159878 master-0 kubenswrapper[17644]: I0319 11:59:34.159864 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:34.159964 master-0 kubenswrapper[17644]: I0319 11:59:34.159908 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:34.159964 master-0 kubenswrapper[17644]: I0319 11:59:34.159951 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:34.160024 master-0 kubenswrapper[17644]: I0319 11:59:34.159997 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:34.160024 master-0 kubenswrapper[17644]: I0319 11:59:34.160019 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:59:34.160121 master-0 kubenswrapper[17644]: I0319 11:59:34.160051 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:34.160121 master-0 kubenswrapper[17644]: I0319 11:59:34.160078 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:34.160121 master-0 kubenswrapper[17644]: I0319 11:59:34.160096 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " 
pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:59:34.160121 master-0 kubenswrapper[17644]: I0319 11:59:34.160114 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:34.160257 master-0 kubenswrapper[17644]: I0319 11:59:34.160136 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:59:34.160257 master-0 kubenswrapper[17644]: I0319 11:59:34.160176 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:34.160257 master-0 kubenswrapper[17644]: I0319 11:59:34.160215 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:34.160257 master-0 kubenswrapper[17644]: I0319 11:59:34.160232 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:59:34.160372 master-0 kubenswrapper[17644]: I0319 11:59:34.160272 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:34.160372 master-0 kubenswrapper[17644]: I0319 11:59:34.160301 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:34.160372 master-0 kubenswrapper[17644]: I0319 11:59:34.160332 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:34.160372 master-0 kubenswrapper[17644]: I0319 11:59:34.160351 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " 
pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:34.160516 master-0 kubenswrapper[17644]: I0319 11:59:34.160389 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:34.160516 master-0 kubenswrapper[17644]: I0319 11:59:34.160416 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:34.160516 master-0 kubenswrapper[17644]: I0319 11:59:34.160452 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:34.160516 master-0 kubenswrapper[17644]: I0319 11:59:34.160486 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:34.160516 master-0 kubenswrapper[17644]: I0319 11:59:34.160508 17644 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:34.160745 master-0 kubenswrapper[17644]: I0319 11:59:34.160599 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:34.160745 master-0 kubenswrapper[17644]: I0319 11:59:34.160625 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:34.160745 master-0 kubenswrapper[17644]: I0319 11:59:34.160685 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:59:34.160745 master-0 kubenswrapper[17644]: I0319 11:59:34.160741 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") 
" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:34.160924 master-0 kubenswrapper[17644]: I0319 11:59:34.160780 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:34.160924 master-0 kubenswrapper[17644]: I0319 11:59:34.160806 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:34.160924 master-0 kubenswrapper[17644]: I0319 11:59:34.160838 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:59:34.160924 master-0 kubenswrapper[17644]: I0319 11:59:34.160874 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 
11:59:34.160924 master-0 kubenswrapper[17644]: I0319 11:59:34.160896 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:34.160924 master-0 kubenswrapper[17644]: I0319 11:59:34.160917 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:59:34.161131 master-0 kubenswrapper[17644]: I0319 11:59:34.160940 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:34.161131 master-0 kubenswrapper[17644]: I0319 11:59:34.160965 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:34.161131 master-0 kubenswrapper[17644]: I0319 11:59:34.160996 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls\") pod \"machine-config-daemon-mgzld\" (UID: 
\"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:34.161483 master-0 kubenswrapper[17644]: I0319 11:59:34.161444 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:59:34.161784 master-0 kubenswrapper[17644]: I0319 11:59:34.161755 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:34.161981 master-0 kubenswrapper[17644]: I0319 11:59:34.161930 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:34.162142 master-0 kubenswrapper[17644]: I0319 11:59:34.162107 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:34.162424 master-0 kubenswrapper[17644]: I0319 11:59:34.162380 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/034cad93-a500-4c58-8d97-fa49866a0d5e-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:34.162753 master-0 kubenswrapper[17644]: I0319 11:59:34.162705 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-webhook-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:34.163029 master-0 kubenswrapper[17644]: I0319 11:59:34.162984 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:59:34.163429 master-0 kubenswrapper[17644]: I0319 11:59:34.163391 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:59:34.164153 master-0 kubenswrapper[17644]: I0319 11:59:34.164111 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034cad93-a500-4c58-8d97-fa49866a0d5e-serving-cert\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " 
pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:34.164608 master-0 kubenswrapper[17644]: I0319 11:59:34.164553 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d41245b-33d4-40f8-bbe1-6d2247e2e335-apiservice-cert\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:34.165077 master-0 kubenswrapper[17644]: I0319 11:59:34.165037 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:59:34.165970 master-0 kubenswrapper[17644]: I0319 11:59:34.165898 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:34.175527 master-0 kubenswrapper[17644]: I0319 11:59:34.175450 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 11:59:34.184953 master-0 kubenswrapper[17644]: I0319 11:59:34.184883 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb22a965-9b36-40cd-993d-747a3978be8e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:59:34.186711 master-0 kubenswrapper[17644]: I0319 11:59:34.186664 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:34.194588 master-0 kubenswrapper[17644]: I0319 11:59:34.194526 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 19 11:59:34.203362 master-0 kubenswrapper[17644]: I0319 11:59:34.203309 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-images\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:34.214169 master-0 kubenswrapper[17644]: I0319 11:59:34.214001 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 11:59:34.233437 master-0 kubenswrapper[17644]: I0319 11:59:34.233379 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-899lw" Mar 19 11:59:34.241922 master-0 kubenswrapper[17644]: E0319 11:59:34.241859 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:59:34.254937 master-0 kubenswrapper[17644]: I0319 11:59:34.254868 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 19 11:59:34.256256 master-0 kubenswrapper[17644]: I0319 11:59:34.256216 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/52bdf7cc-f07d-487e-937c-6567f194947e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:59:34.273648 master-0 kubenswrapper[17644]: I0319 11:59:34.273580 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-c95p8" Mar 19 11:59:34.293990 master-0 kubenswrapper[17644]: I0319 11:59:34.293931 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 19 11:59:34.295374 master-0 kubenswrapper[17644]: I0319 11:59:34.295338 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cert\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:34.314059 master-0 kubenswrapper[17644]: I0319 11:59:34.313999 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 19 11:59:34.323622 master-0 kubenswrapper[17644]: I0319 11:59:34.323563 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/92e401a4-ed2f-46f7-924b-329d7b313e6a-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:34.333802 master-0 kubenswrapper[17644]: I0319 11:59:34.333532 17644 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 11:59:34.355018 master-0 kubenswrapper[17644]: I0319 11:59:34.354963 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 19 11:59:34.365400 master-0 kubenswrapper[17644]: I0319 11:59:34.365340 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e401a4-ed2f-46f7-924b-329d7b313e6a-config\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:34.373272 master-0 kubenswrapper[17644]: I0319 11:59:34.373208 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-9hcb7" Mar 19 11:59:34.394613 master-0 kubenswrapper[17644]: I0319 11:59:34.394545 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 11:59:34.395348 master-0 kubenswrapper[17644]: I0319 11:59:34.395245 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c7eb66-e23e-40df-883c-fed012c07f26-proxy-tls\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:34.413613 master-0 kubenswrapper[17644]: I0319 11:59:34.413572 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 11:59:34.416018 master-0 kubenswrapper[17644]: I0319 11:59:34.415985 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-images\") pod 
\"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:34.433905 master-0 kubenswrapper[17644]: I0319 11:59:34.433833 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 11:59:34.442027 master-0 kubenswrapper[17644]: I0319 11:59:34.441966 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-images\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:34.454938 master-0 kubenswrapper[17644]: I0319 11:59:34.454824 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 11:59:34.462801 master-0 kubenswrapper[17644]: I0319 11:59:34.462751 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12809811-c9df-4e77-8c12-309831b8975d-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:59:34.463263 master-0 kubenswrapper[17644]: I0319 11:59:34.463224 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24f71770-714e-4111-9188-ad8663c6baa7-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:34.463969 master-0 kubenswrapper[17644]: I0319 11:59:34.463938 17644 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c7eb66-e23e-40df-883c-fed012c07f26-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:34.475081 master-0 kubenswrapper[17644]: I0319 11:59:34.474979 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 11:59:34.492097 master-0 kubenswrapper[17644]: I0319 11:59:34.492016 17644 request.go:700] Waited for 1.99242964s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dmachine-api-operator-dockercfg-6qwsh&limit=500&resourceVersion=0 Mar 19 11:59:34.493321 master-0 kubenswrapper[17644]: I0319 11:59:34.493289 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-6qwsh" Mar 19 11:59:34.514244 master-0 kubenswrapper[17644]: I0319 11:59:34.514165 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 11:59:34.514590 master-0 kubenswrapper[17644]: I0319 11:59:34.514522 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:34.533054 master-0 kubenswrapper[17644]: I0319 11:59:34.532997 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 11:59:34.540179 
master-0 kubenswrapper[17644]: I0319 11:59:34.540140 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-config\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:34.554922 master-0 kubenswrapper[17644]: I0319 11:59:34.554873 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 11:59:34.555494 master-0 kubenswrapper[17644]: I0319 11:59:34.555470 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-certs\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:59:34.573385 master-0 kubenswrapper[17644]: I0319 11:59:34.573337 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 19 11:59:34.583327 master-0 kubenswrapper[17644]: I0319 11:59:34.583270 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac09dba7-398c-4b0a-a415-edb73cb4cf30-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:59:34.594960 master-0 kubenswrapper[17644]: I0319 11:59:34.594899 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-dnwcp" Mar 19 11:59:34.613904 master-0 kubenswrapper[17644]: I0319 11:59:34.613833 17644 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 11:59:34.633588 master-0 kubenswrapper[17644]: I0319 11:59:34.633545 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 11:59:34.653359 master-0 kubenswrapper[17644]: I0319 11:59:34.653298 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 11:59:34.654562 master-0 kubenswrapper[17644]: I0319 11:59:34.654520 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12809811-c9df-4e77-8c12-309831b8975d-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:59:34.674320 master-0 kubenswrapper[17644]: I0319 11:59:34.674225 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-5jj8d" Mar 19 11:59:34.694443 master-0 kubenswrapper[17644]: I0319 11:59:34.694371 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 11:59:34.705002 master-0 kubenswrapper[17644]: I0319 11:59:34.704941 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6870ccc7-2094-48d8-9238-f486a4b8d5af-node-bootstrap-token\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:59:34.713816 master-0 kubenswrapper[17644]: I0319 11:59:34.713766 17644 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-75w76" Mar 19 11:59:34.734986 master-0 kubenswrapper[17644]: I0319 11:59:34.734866 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 11:59:34.744672 master-0 kubenswrapper[17644]: I0319 11:59:34.744629 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac09dba7-398c-4b0a-a415-edb73cb4cf30-cert\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:59:34.753246 master-0 kubenswrapper[17644]: I0319 11:59:34.753211 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 11:59:34.762167 master-0 kubenswrapper[17644]: I0319 11:59:34.762123 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24f71770-714e-4111-9188-ad8663c6baa7-proxy-tls\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:34.773604 master-0 kubenswrapper[17644]: I0319 11:59:34.773506 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-2l456" Mar 19 11:59:34.794223 master-0 kubenswrapper[17644]: I0319 11:59:34.794161 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 11:59:34.803769 master-0 kubenswrapper[17644]: I0319 11:59:34.803681 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-machine-approver-tls\") pod 
\"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:34.814509 master-0 kubenswrapper[17644]: I0319 11:59:34.814439 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 11:59:34.824269 master-0 kubenswrapper[17644]: I0319 11:59:34.824205 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-auth-proxy-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:34.836343 master-0 kubenswrapper[17644]: I0319 11:59:34.836271 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 11:59:34.843210 master-0 kubenswrapper[17644]: I0319 11:59:34.843152 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-config\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:34.854556 master-0 kubenswrapper[17644]: I0319 11:59:34.854472 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 11:59:34.874706 master-0 kubenswrapper[17644]: I0319 11:59:34.874633 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-kn6lc" Mar 19 11:59:34.894843 master-0 kubenswrapper[17644]: I0319 11:59:34.894551 17644 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 11:59:34.896030 master-0 kubenswrapper[17644]: I0319 11:59:34.895984 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:34.914351 master-0 kubenswrapper[17644]: I0319 11:59:34.914264 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 11:59:34.916002 master-0 kubenswrapper[17644]: I0319 11:59:34.915964 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:34.934530 master-0 kubenswrapper[17644]: I0319 11:59:34.934453 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 19 11:59:34.944506 master-0 kubenswrapper[17644]: I0319 11:59:34.944446 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:34.954460 master-0 kubenswrapper[17644]: I0319 11:59:34.954191 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 19 11:59:34.974314 master-0 kubenswrapper[17644]: I0319 11:59:34.974241 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-7qzrj" Mar 19 11:59:34.994216 master-0 kubenswrapper[17644]: I0319 11:59:34.994076 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 19 11:59:35.004239 master-0 kubenswrapper[17644]: I0319 11:59:35.004204 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:35.014946 master-0 kubenswrapper[17644]: I0319 11:59:35.014895 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 19 11:59:35.015324 master-0 kubenswrapper[17644]: I0319 11:59:35.015288 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:35.033565 master-0 kubenswrapper[17644]: 
I0319 11:59:35.033521 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 19 11:59:35.035266 master-0 kubenswrapper[17644]: I0319 11:59:35.035231 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:35.054077 master-0 kubenswrapper[17644]: I0319 11:59:35.054018 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-strbt" Mar 19 11:59:35.074755 master-0 kubenswrapper[17644]: I0319 11:59:35.074687 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 11:59:35.076309 master-0 kubenswrapper[17644]: I0319 11:59:35.076047 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d06b230b-db67-4afc-8d10-2c33ad568462-node-exporter-tls\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:35.094696 master-0 kubenswrapper[17644]: I0319 11:59:35.094624 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8skrb" Mar 19 11:59:35.113211 master-0 kubenswrapper[17644]: I0319 11:59:35.113153 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 19 11:59:35.115606 master-0 kubenswrapper[17644]: I0319 11:59:35.115576 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:35.134328 master-0 kubenswrapper[17644]: I0319 11:59:35.134254 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-zf4zz" Mar 19 11:59:35.155050 master-0 kubenswrapper[17644]: I0319 11:59:35.154979 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 19 11:59:35.169651 master-0 kubenswrapper[17644]: I0319 11:59:35.169578 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:35.169651 master-0 kubenswrapper[17644]: E0319 11:59:35.169237 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.169651 master-0 kubenswrapper[17644]: E0319 11:59:35.169235 17644 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.169651 master-0 kubenswrapper[17644]: E0319 11:59:35.169405 17644 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.169651 master-0 kubenswrapper[17644]: E0319 11:59:35.169439 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync 
configmap cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.169495 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.169499 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.169525 17644 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-6ro5itlgu7nag: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.169546 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.169535 17644 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.169525 17644 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.169573 17644 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.169578 17644 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.170063 17644 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170038402 +0000 UTC m=+9.939996437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.170099 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170091643 +0000 UTC m=+9.940049678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.170129 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config podName:dedf55c4-eeda-4955-aafe-db1fdb8c4a48 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170123044 +0000 UTC m=+9.940081079 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-5dc6c74576-lwqmn" (UID: "dedf55c4-eeda-4955-aafe-db1fdb8c4a48") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.170157 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls podName:f4aad0ff-e6cd-4c43-9561-80a14fee4712 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170148554 +0000 UTC m=+9.940106589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls") pod "prometheus-operator-6c8df6d4b-xfwkr" (UID: "f4aad0ff-e6cd-4c43-9561-80a14fee4712") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.170178 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170168245 +0000 UTC m=+9.940126290 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.170200 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca podName:2d63d5a8-f45d-4678-824d-5534b2bcd6ca nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170191705 +0000 UTC m=+9.940149940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca") pod "kube-state-metrics-7bbc969446-xkg9f" (UID: "2d63d5a8-f45d-4678-824d-5534b2bcd6ca") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170195 master-0 kubenswrapper[17644]: E0319 11:59:35.170221 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170211796 +0000 UTC m=+9.940170031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170808 master-0 kubenswrapper[17644]: E0319 11:59:35.170253 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca podName:dedf55c4-eeda-4955-aafe-db1fdb8c4a48 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170228586 +0000 UTC m=+9.940186801 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca") pod "openshift-state-metrics-5dc6c74576-lwqmn" (UID: "dedf55c4-eeda-4955-aafe-db1fdb8c4a48") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170808 master-0 kubenswrapper[17644]: E0319 11:59:35.170286 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls podName:dedf55c4-eeda-4955-aafe-db1fdb8c4a48 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170275377 +0000 UTC m=+9.940233612 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls") pod "openshift-state-metrics-5dc6c74576-lwqmn" (UID: "dedf55c4-eeda-4955-aafe-db1fdb8c4a48") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170808 master-0 kubenswrapper[17644]: E0319 11:59:35.170306 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls podName:5f8c022c-7871-4765-971f-dcafa39357c9 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170296558 +0000 UTC m=+9.940254793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls") pod "metrics-server-64d6dd6b7b-xdrz5" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9") : failed to sync secret cache: timed out waiting for the condition Mar 19 11:59:35.170808 master-0 kubenswrapper[17644]: E0319 11:59:35.170328 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca podName:d06b230b-db67-4afc-8d10-2c33ad568462 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170318598 +0000 UTC m=+9.940276853 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca") pod "node-exporter-pnb9m" (UID: "d06b230b-db67-4afc-8d10-2c33ad568462") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.170808 master-0 kubenswrapper[17644]: E0319 11:59:35.170349 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca podName:f4aad0ff-e6cd-4c43-9561-80a14fee4712 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:36.170340489 +0000 UTC m=+9.940298734 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca") pod "prometheus-operator-6c8df6d4b-xfwkr" (UID: "f4aad0ff-e6cd-4c43-9561-80a14fee4712") : failed to sync configmap cache: timed out waiting for the condition Mar 19 11:59:35.175948 master-0 kubenswrapper[17644]: I0319 11:59:35.175833 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 19 11:59:35.195067 master-0 kubenswrapper[17644]: I0319 11:59:35.195011 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-w94s8" Mar 19 11:59:35.213913 master-0 kubenswrapper[17644]: I0319 11:59:35.213709 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 19 11:59:35.233128 master-0 kubenswrapper[17644]: I0319 11:59:35.233065 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-68jgh" Mar 19 11:59:35.253846 master-0 kubenswrapper[17644]: I0319 11:59:35.253699 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" 
Mar 19 11:59:35.274514 master-0 kubenswrapper[17644]: I0319 11:59:35.274433 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 19 11:59:35.294359 master-0 kubenswrapper[17644]: I0319 11:59:35.294251 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 11:59:35.313763 master-0 kubenswrapper[17644]: I0319 11:59:35.313674 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 19 11:59:35.335161 master-0 kubenswrapper[17644]: I0319 11:59:35.335084 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 19 11:59:35.353614 master-0 kubenswrapper[17644]: I0319 11:59:35.353527 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-6ro5itlgu7nag" Mar 19 11:59:35.373846 master-0 kubenswrapper[17644]: I0319 11:59:35.373772 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 11:59:35.404691 master-0 kubenswrapper[17644]: E0319 11:59:35.404594 17644 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.909s" Mar 19 11:59:35.413700 master-0 kubenswrapper[17644]: I0319 11:59:35.413652 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 19 11:59:35.452590 master-0 kubenswrapper[17644]: I0319 11:59:35.452497 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qql5t\" (UniqueName: \"kubernetes.io/projected/b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d-kube-api-access-qql5t\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w\" (UID: \"b1fa5ba9-eefb-4754-9a2d-b8a5bb328b8d\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-7g57w" Mar 19 11:59:35.469147 master-0 kubenswrapper[17644]: I0319 11:59:35.469063 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rm4\" (UniqueName: \"kubernetes.io/projected/e5078f17-bc65-460f-9f18-8c506db6840b-kube-api-access-s5rm4\") pod \"package-server-manager-7b95f86987-jq5vq\" (UID: \"e5078f17-bc65-460f-9f18-8c506db6840b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:59:35.495062 master-0 kubenswrapper[17644]: I0319 11:59:35.494965 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bb2x\" (UniqueName: \"kubernetes.io/projected/3053504d-0734-4def-b639-0f5cc2178185-kube-api-access-2bb2x\") pod \"ovnkube-node-4qxkd\" (UID: \"3053504d-0734-4def-b639-0f5cc2178185\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:35.509977 master-0 kubenswrapper[17644]: I0319 11:59:35.509795 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrgqb\" (UniqueName: \"kubernetes.io/projected/a3ceeece-bee9-4fcb-8517-95ebce38e223-kube-api-access-zrgqb\") pod \"openshift-config-operator-95bf4f4d-ng9ss\" (UID: \"a3ceeece-bee9-4fcb-8517-95ebce38e223\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:59:35.512134 master-0 kubenswrapper[17644]: I0319 11:59:35.512076 17644 request.go:700] Waited for 2.9092349s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Mar 19 11:59:35.527744 master-0 kubenswrapper[17644]: I0319 11:59:35.527662 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6rv\" (UniqueName: 
\"kubernetes.io/projected/75aedbcd-f6ed-43a1-941b-4b04887ffe8e-kube-api-access-dd6rv\") pod \"machine-api-operator-6fbb6cf6f9-jf7p6\" (UID: \"75aedbcd-f6ed-43a1-941b-4b04887ffe8e\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-jf7p6" Mar 19 11:59:35.555872 master-0 kubenswrapper[17644]: I0319 11:59:35.555299 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dg9r\" (UniqueName: \"kubernetes.io/projected/6870ccc7-2094-48d8-9238-f486a4b8d5af-kube-api-access-9dg9r\") pod \"machine-config-server-ltk8s\" (UID: \"6870ccc7-2094-48d8-9238-f486a4b8d5af\") " pod="openshift-machine-config-operator/machine-config-server-ltk8s" Mar 19 11:59:35.579171 master-0 kubenswrapper[17644]: I0319 11:59:35.579089 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj527\" (UniqueName: \"kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527\") pod \"controller-manager-548bb99f44-txbjj\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") " pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 11:59:35.587056 master-0 kubenswrapper[17644]: I0319 11:59:35.587015 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7nhq\" (UniqueName: \"kubernetes.io/projected/92e401a4-ed2f-46f7-924b-329d7b313e6a-kube-api-access-c7nhq\") pod \"cluster-baremetal-operator-6f69995874-942g6\" (UID: \"92e401a4-ed2f-46f7-924b-329d7b313e6a\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" Mar 19 11:59:35.608230 master-0 kubenswrapper[17644]: I0319 11:59:35.608158 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-894bt\" (UniqueName: \"kubernetes.io/projected/cf6aab0e-defc-4a4b-8a07-f5af8bf177c4-kube-api-access-894bt\") pod \"redhat-marketplace-ccbc5\" (UID: \"cf6aab0e-defc-4a4b-8a07-f5af8bf177c4\") " pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 
11:59:35.626827 master-0 kubenswrapper[17644]: I0319 11:59:35.626716 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p4hg\" (UniqueName: \"kubernetes.io/projected/6611e325-6152-480c-9c2c-1b503e49ccd2-kube-api-access-4p4hg\") pod \"cluster-olm-operator-67dcd4998-rgbzk\" (UID: \"6611e325-6152-480c-9c2c-1b503e49ccd2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-rgbzk" Mar 19 11:59:35.645796 master-0 kubenswrapper[17644]: I0319 11:59:35.645718 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pngsr\" (UniqueName: \"kubernetes.io/projected/3c3b0d24-ce5e-49c3-a546-874356f75dc6-kube-api-access-pngsr\") pod \"network-operator-7bd846bfc4-7fz6w\" (UID: \"3c3b0d24-ce5e-49c3-a546-874356f75dc6\") " pod="openshift-network-operator/network-operator-7bd846bfc4-7fz6w" Mar 19 11:59:35.667253 master-0 kubenswrapper[17644]: I0319 11:59:35.667172 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nfnb\" (UniqueName: \"kubernetes.io/projected/aaaaf539-bf61-44d7-8d47-97535b7aa1ba-kube-api-access-7nfnb\") pod \"cluster-node-tuning-operator-598fbc5f8f-kb5vd\" (UID: \"aaaaf539-bf61-44d7-8d47-97535b7aa1ba\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-kb5vd" Mar 19 11:59:35.687354 master-0 kubenswrapper[17644]: I0319 11:59:35.687293 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9ddk\" (UniqueName: \"kubernetes.io/projected/00dd3703-af25-4e71-b20b-b3e153383489-kube-api-access-k9ddk\") pod \"certified-operators-gwt6h\" (UID: \"00dd3703-af25-4e71-b20b-b3e153383489\") " pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:59:35.705898 master-0 kubenswrapper[17644]: I0319 11:59:35.705830 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4wk\" (UniqueName: 
\"kubernetes.io/projected/09a22c25-6073-4b1a-a029-928452ef37db-kube-api-access-xx4wk\") pod \"multus-552pc\" (UID: \"09a22c25-6073-4b1a-a029-928452ef37db\") " pod="openshift-multus/multus-552pc" Mar 19 11:59:35.732314 master-0 kubenswrapper[17644]: I0319 11:59:35.732189 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr788\" (UniqueName: \"kubernetes.io/projected/a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc-kube-api-access-dr788\") pod \"machine-approver-5c6485487f-5zvc5\" (UID: \"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" Mar 19 11:59:35.750008 master-0 kubenswrapper[17644]: I0319 11:59:35.749935 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcb7\" (UniqueName: \"kubernetes.io/projected/e2ad29ad-70ef-43c6-91f6-02f04d145673-kube-api-access-trcb7\") pod \"router-default-7dcf5569b5-kpmgt\" (UID: \"e2ad29ad-70ef-43c6-91f6-02f04d145673\") " pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:35.778259 master-0 kubenswrapper[17644]: I0319 11:59:35.778129 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d3ac31-9259-454b-8e1c-e23024f8f2b2-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-pjc7h\" (UID: \"39d3ac31-9259-454b-8e1c-e23024f8f2b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-pjc7h" Mar 19 11:59:35.789592 master-0 kubenswrapper[17644]: I0319 11:59:35.789347 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v27lg\" (UniqueName: \"kubernetes.io/projected/f5d73fef-1414-4b29-97ea-42e1c0b1ef18-kube-api-access-v27lg\") pod \"service-ca-operator-b865698dc-md7m5\" (UID: \"f5d73fef-1414-4b29-97ea-42e1c0b1ef18\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-md7m5" Mar 19 11:59:35.817506 master-0 
kubenswrapper[17644]: I0319 11:59:35.817430 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8bmw\" (UniqueName: \"kubernetes.io/projected/716c2176-50f9-4c4f-af0e-4c7973457df2-kube-api-access-m8bmw\") pod \"olm-operator-5c9796789-l9sw9\" (UID: \"716c2176-50f9-4c4f-af0e-4c7973457df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:59:35.825266 master-0 kubenswrapper[17644]: I0319 11:59:35.825209 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qrb\" (UniqueName: \"kubernetes.io/projected/681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9-kube-api-access-79qrb\") pod \"cluster-monitoring-operator-58845fbb57-tkcwh\" (UID: \"681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-tkcwh" Mar 19 11:59:35.844980 master-0 kubenswrapper[17644]: I0319 11:59:35.844903 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7784\" (UniqueName: \"kubernetes.io/projected/8376e1f9-ab05-42d4-aa66-284a167a9bfc-kube-api-access-n7784\") pod \"tuned-x6mmm\" (UID: \"8376e1f9-ab05-42d4-aa66-284a167a9bfc\") " pod="openshift-cluster-node-tuning-operator/tuned-x6mmm" Mar 19 11:59:35.867640 master-0 kubenswrapper[17644]: I0319 11:59:35.867562 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lscpq\" (UniqueName: \"kubernetes.io/projected/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-kube-api-access-lscpq\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:35.886795 master-0 kubenswrapper[17644]: I0319 11:59:35.886705 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6bf\" (UniqueName: \"kubernetes.io/projected/1c898657-f06b-44ab-95ff-53a324759ba1-kube-api-access-mt6bf\") pod 
\"node-resolver-pm77f\" (UID: \"1c898657-f06b-44ab-95ff-53a324759ba1\") " pod="openshift-dns/node-resolver-pm77f" Mar 19 11:59:35.906793 master-0 kubenswrapper[17644]: I0319 11:59:35.906704 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rt57\" (UniqueName: \"kubernetes.io/projected/2292109e-92a9-4286-858e-dcd2ac083c43-kube-api-access-8rt57\") pod \"csi-snapshot-controller-operator-5f5d689c6b-fx8ng\" (UID: \"2292109e-92a9-4286-858e-dcd2ac083c43\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-fx8ng" Mar 19 11:59:35.927621 master-0 kubenswrapper[17644]: I0319 11:59:35.927533 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkm97\" (UniqueName: \"kubernetes.io/projected/cf08ab4f-c203-4c16-9826-8cc049f4af31-kube-api-access-lkm97\") pod \"catalog-operator-68f85b4d6c-n5gr9\" (UID: \"cf08ab4f-c203-4c16-9826-8cc049f4af31\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:59:35.947038 master-0 kubenswrapper[17644]: I0319 11:59:35.946969 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzrh8\" (UniqueName: \"kubernetes.io/projected/6230ed8f-4608-4168-8f5a-656f411b0ef7-kube-api-access-wzrh8\") pod \"network-check-target-cr8n7\" (UID: \"6230ed8f-4608-4168-8f5a-656f411b0ef7\") " pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:59:35.968452 master-0 kubenswrapper[17644]: I0319 11:59:35.968371 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fvvj\" (UniqueName: \"kubernetes.io/projected/e65e2a2f-16b5-44a3-9860-741f70188ab5-kube-api-access-4fvvj\") pod \"network-check-source-b4bf74f6-llsdf\" (UID: \"e65e2a2f-16b5-44a3-9860-741f70188ab5\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-llsdf" Mar 19 11:59:35.989412 master-0 kubenswrapper[17644]: I0319 11:59:35.989331 17644 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdx6s\" (UniqueName: \"kubernetes.io/projected/12809811-c9df-4e77-8c12-309831b8975d-kube-api-access-bdx6s\") pod \"machine-config-controller-b4f87c5b9-lg6h9\" (UID: \"12809811-c9df-4e77-8c12-309831b8975d\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-lg6h9" Mar 19 11:59:36.008194 master-0 kubenswrapper[17644]: I0319 11:59:36.008109 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqr6w\" (UniqueName: \"kubernetes.io/projected/8438d015-106b-4aed-ae12-dda781ce51fc-kube-api-access-cqr6w\") pod \"network-node-identity-j528w\" (UID: \"8438d015-106b-4aed-ae12-dda781ce51fc\") " pod="openshift-network-node-identity/network-node-identity-j528w" Mar 19 11:59:36.027520 master-0 kubenswrapper[17644]: I0319 11:59:36.027421 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbhv4\" (UniqueName: \"kubernetes.io/projected/ac09dba7-398c-4b0a-a415-edb73cb4cf30-kube-api-access-pbhv4\") pod \"cluster-autoscaler-operator-866dc4744-dnx7f\" (UID: \"ac09dba7-398c-4b0a-a415-edb73cb4cf30\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-dnx7f" Mar 19 11:59:36.047079 master-0 kubenswrapper[17644]: I0319 11:59:36.046894 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g997b\" (UniqueName: \"kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.067580 master-0 kubenswrapper[17644]: I0319 11:59:36.067502 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jlb\" (UniqueName: \"kubernetes.io/projected/daf4dbb6-5a0a-4c92-a930-479a7330ace1-kube-api-access-72jlb\") pod 
\"ovnkube-control-plane-57f769d897-zs6dd\" (UID: \"daf4dbb6-5a0a-4c92-a930-479a7330ace1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" Mar 19 11:59:36.086430 master-0 kubenswrapper[17644]: I0319 11:59:36.086343 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbq7n\" (UniqueName: \"kubernetes.io/projected/d1eef757-d63a-4708-8efe-7b27eea1ff63-kube-api-access-kbq7n\") pod \"community-operators-h668l\" (UID: \"d1eef757-d63a-4708-8efe-7b27eea1ff63\") " pod="openshift-marketplace/community-operators-h668l" Mar 19 11:59:36.106593 master-0 kubenswrapper[17644]: I0319 11:59:36.106535 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m287x\" (UniqueName: \"kubernetes.io/projected/24f71770-714e-4111-9188-ad8663c6baa7-kube-api-access-m287x\") pod \"machine-config-daemon-mgzld\" (UID: \"24f71770-714e-4111-9188-ad8663c6baa7\") " pod="openshift-machine-config-operator/machine-config-daemon-mgzld" Mar 19 11:59:36.127267 master-0 kubenswrapper[17644]: I0319 11:59:36.127190 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbcbba74-ac53-4724-a217-4d9b85e7c1db-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-5gvgh\" (UID: \"dbcbba74-ac53-4724-a217-4d9b85e7c1db\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-5gvgh" Mar 19 11:59:36.146408 master-0 kubenswrapper[17644]: I0319 11:59:36.146319 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgs4l\" (UniqueName: \"kubernetes.io/projected/f29b11ce-60e0-46b3-8d28-eea3452513cd-kube-api-access-bgs4l\") pod \"network-metrics-daemon-f6wv7\" (UID: \"f29b11ce-60e0-46b3-8d28-eea3452513cd\") " pod="openshift-multus/network-metrics-daemon-f6wv7" Mar 19 11:59:36.166859 master-0 kubenswrapper[17644]: I0319 11:59:36.166772 17644 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") pod \"multus-admission-controller-5dbbb8b86f-wdwkz\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 11:59:36.190775 master-0 kubenswrapper[17644]: I0319 11:59:36.190653 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5fnx\" (UniqueName: \"kubernetes.io/projected/66f88242-8b0b-4790-bbb6-445c19b34ee7-kube-api-access-p5fnx\") pod \"openshift-apiserver-operator-d65958b8-6hsqn\" (UID: \"66f88242-8b0b-4790-bbb6-445c19b34ee7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-6hsqn" Mar 19 11:59:36.204297 master-0 kubenswrapper[17644]: I0319 11:59:36.203813 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:36.204297 master-0 kubenswrapper[17644]: I0319 11:59:36.203911 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:36.204297 master-0 kubenswrapper[17644]: I0319 11:59:36.204163 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") pod 
\"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.204297 master-0 kubenswrapper[17644]: I0319 11:59:36.204212 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:36.204297 master-0 kubenswrapper[17644]: I0319 11:59:36.204284 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d06b230b-db67-4afc-8d10-2c33ad568462-metrics-client-ca\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:36.204719 master-0 kubenswrapper[17644]: I0319 11:59:36.204367 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:36.204719 master-0 kubenswrapper[17644]: I0319 11:59:36.204493 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.204719 master-0 kubenswrapper[17644]: I0319 11:59:36.204563 17644 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.204746 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.204766 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.204788 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.204995 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " 
pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.205031 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.205046 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.205208 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.205258 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.205269 17644 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4aad0ff-e6cd-4c43-9561-80a14fee4712-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.205293 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.205355 master-0 kubenswrapper[17644]: I0319 11:59:36.205316 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:36.205662 master-0 kubenswrapper[17644]: I0319 11:59:36.205542 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:36.205662 master-0 kubenswrapper[17644]: I0319 11:59:36.205547 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4aad0ff-e6cd-4c43-9561-80a14fee4712-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: 
\"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:36.205716 master-0 kubenswrapper[17644]: I0319 11:59:36.205699 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dedf55c4-eeda-4955-aafe-db1fdb8c4a48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-lwqmn\" (UID: \"dedf55c4-eeda-4955-aafe-db1fdb8c4a48\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-lwqmn" Mar 19 11:59:36.205916 master-0 kubenswrapper[17644]: I0319 11:59:36.205817 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.205959 master-0 kubenswrapper[17644]: I0319 11:59:36.205943 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") pod \"metrics-server-64d6dd6b7b-xdrz5\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") " pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:36.206530 master-0 kubenswrapper[17644]: I0319 11:59:36.206505 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b61ea14-a7ea-49f3-9df4-5655765ddf7c-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-4wj9n\" (UID: \"9b61ea14-a7ea-49f3-9df4-5655765ddf7c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-4wj9n" Mar 19 11:59:36.226489 master-0 kubenswrapper[17644]: I0319 
11:59:36.226417 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hrw\" (UniqueName: \"kubernetes.io/projected/376b18a9-5f33-44fd-a37b-20ab02c5e65d-kube-api-access-f2hrw\") pod \"catalogd-controller-manager-6864dc98f7-xzxpq\" (UID: \"376b18a9-5f33-44fd-a37b-20ab02c5e65d\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:36.245723 master-0 kubenswrapper[17644]: I0319 11:59:36.245669 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvkxx\" (UniqueName: \"kubernetes.io/projected/e45616db-f7dd-4a08-847f-abf2759d9fa4-kube-api-access-dvkxx\") pod \"apiserver-899bc59d8-xxr9r\" (UID: \"e45616db-f7dd-4a08-847f-abf2759d9fa4\") " pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:36.265451 master-0 kubenswrapper[17644]: I0319 11:59:36.265401 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvz6\" (UniqueName: \"kubernetes.io/projected/732989c5-1b89-46f0-9917-b68613f7f005-kube-api-access-bfvz6\") pod \"authentication-operator-5885bfd7f4-gqd94\" (UID: \"732989c5-1b89-46f0-9917-b68613f7f005\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-gqd94" Mar 19 11:59:36.288918 master-0 kubenswrapper[17644]: I0319 11:59:36.288874 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfxw7\" (UniqueName: \"kubernetes.io/projected/7a51eeaf-1349-4bf3-932d-22ed5ce7c161-kube-api-access-cfxw7\") pod \"control-plane-machine-set-operator-6f97756bc8-j7rc9\" (UID: \"7a51eeaf-1349-4bf3-932d-22ed5ce7c161\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" Mar 19 11:59:36.306096 master-0 kubenswrapper[17644]: I0319 11:59:36.305985 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd6ec279-d92f-45c2-97c2-88b96fbd6600-kube-api-access\") 
pod \"cluster-version-operator-7d58488df-rcbf8\" (UID: \"dd6ec279-d92f-45c2-97c2-88b96fbd6600\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-rcbf8" Mar 19 11:59:36.325711 master-0 kubenswrapper[17644]: I0319 11:59:36.325642 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5skx\" (UniqueName: \"kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx\") pod \"route-controller-manager-864f875b6b-rcjvd\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") " pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 11:59:36.351822 master-0 kubenswrapper[17644]: I0319 11:59:36.351753 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9bx\" (UniqueName: \"kubernetes.io/projected/0cbbe8d0-aafb-499f-a1f4-affcea62c1ab-kube-api-access-8v9bx\") pod \"cloud-credential-operator-744f9dbf77-mhvls\" (UID: \"0cbbe8d0-aafb-499f-a1f4-affcea62c1ab\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-mhvls" Mar 19 11:59:36.365043 master-0 kubenswrapper[17644]: I0319 11:59:36.364999 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5th4l\" (UniqueName: \"kubernetes.io/projected/6e76fc3f-39a4-4f99-8603-38a94da6ea8e-kube-api-access-5th4l\") pod \"service-ca-79bc6b8d76-lzfbh\" (UID: \"6e76fc3f-39a4-4f99-8603-38a94da6ea8e\") " pod="openshift-service-ca/service-ca-79bc6b8d76-lzfbh" Mar 19 11:59:36.384546 master-0 kubenswrapper[17644]: I0319 11:59:36.384466 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrd5\" (UniqueName: \"kubernetes.io/projected/2d63d5a8-f45d-4678-824d-5534b2bcd6ca-kube-api-access-kwrd5\") pod \"kube-state-metrics-7bbc969446-xkg9f\" (UID: \"2d63d5a8-f45d-4678-824d-5534b2bcd6ca\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-xkg9f" Mar 19 11:59:36.412169 master-0 
kubenswrapper[17644]: I0319 11:59:36.412086 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bbtl\" (UniqueName: \"kubernetes.io/projected/d06b230b-db67-4afc-8d10-2c33ad568462-kube-api-access-4bbtl\") pod \"node-exporter-pnb9m\" (UID: \"d06b230b-db67-4afc-8d10-2c33ad568462\") " pod="openshift-monitoring/node-exporter-pnb9m" Mar 19 11:59:36.427002 master-0 kubenswrapper[17644]: I0319 11:59:36.426543 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-bound-sa-token\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:59:36.453429 master-0 kubenswrapper[17644]: I0319 11:59:36.453343 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgz7q\" (UniqueName: \"kubernetes.io/projected/4e2c195f-e97d-4cac-81fc-2d5c551d1c30-kube-api-access-kgz7q\") pod \"iptables-alerter-n52gc\" (UID: \"4e2c195f-e97d-4cac-81fc-2d5c551d1c30\") " pod="openshift-network-operator/iptables-alerter-n52gc" Mar 19 11:59:36.464769 master-0 kubenswrapper[17644]: I0319 11:59:36.464691 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jsb\" (UniqueName: \"kubernetes.io/projected/c8d8a09f-22d5-4f16-84d6-d5f2c504c949-kube-api-access-p5jsb\") pod \"cluster-cloud-controller-manager-operator-7dff898856-87z86\" (UID: \"c8d8a09f-22d5-4f16-84d6-d5f2c504c949\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" Mar 19 11:59:36.486271 master-0 kubenswrapper[17644]: I0319 11:59:36.486213 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg6sp\" (UniqueName: \"kubernetes.io/projected/cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103-kube-api-access-hg6sp\") pod 
\"redhat-operators-w2fqh\" (UID: \"cbfd5667-f6f4-4c7c-92b2-ea4ecd0f0103\") " pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:36.507288 master-0 kubenswrapper[17644]: I0319 11:59:36.507157 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srlcl\" (UniqueName: \"kubernetes.io/projected/1b94d1eb-1b80-4a14-b1c0-d9e192231352-kube-api-access-srlcl\") pod \"operator-controller-controller-manager-57777556ff-mjwfm\" (UID: \"1b94d1eb-1b80-4a14-b1c0-d9e192231352\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:36.525068 master-0 kubenswrapper[17644]: I0319 11:59:36.524989 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptl6\" (UniqueName: \"kubernetes.io/projected/034cad93-a500-4c58-8d97-fa49866a0d5e-kube-api-access-jptl6\") pod \"insights-operator-68bf6ff9d6-djfg8\" (UID: \"034cad93-a500-4c58-8d97-fa49866a0d5e\") " pod="openshift-insights/insights-operator-68bf6ff9d6-djfg8" Mar 19 11:59:36.532500 master-0 kubenswrapper[17644]: I0319 11:59:36.532451 17644 request.go:700] Waited for 3.915122352s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/serviceaccounts/prometheus-operator/token Mar 19 11:59:36.547983 master-0 kubenswrapper[17644]: I0319 11:59:36.547846 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zndqq\" (UniqueName: \"kubernetes.io/projected/f4aad0ff-e6cd-4c43-9561-80a14fee4712-kube-api-access-zndqq\") pod \"prometheus-operator-6c8df6d4b-xfwkr\" (UID: \"f4aad0ff-e6cd-4c43-9561-80a14fee4712\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-xfwkr" Mar 19 11:59:36.567068 master-0 kubenswrapper[17644]: I0319 11:59:36.566886 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nds54\" (UniqueName: 
\"kubernetes.io/projected/e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf-kube-api-access-nds54\") pod \"openshift-controller-manager-operator-8c94f4649-6ghdm\" (UID: \"e7f0a5ee-5e7a-4946-bffa-5d98aa5890bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-6ghdm" Mar 19 11:59:36.587992 master-0 kubenswrapper[17644]: I0319 11:59:36.587915 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx487\" (UniqueName: \"kubernetes.io/projected/b5c7eb66-e23e-40df-883c-fed012c07f26-kube-api-access-tx487\") pod \"machine-config-operator-84d549f6d5-66wvv\" (UID: \"b5c7eb66-e23e-40df-883c-fed012c07f26\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-66wvv" Mar 19 11:59:36.605960 master-0 kubenswrapper[17644]: I0319 11:59:36.605889 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88ghj\" (UniqueName: \"kubernetes.io/projected/e48b5aa9-293e-4222-91ff-7640addeca4c-kube-api-access-88ghj\") pod \"apiserver-f67f6868b-chx8j\" (UID: \"e48b5aa9-293e-4222-91ff-7640addeca4c\") " pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:36.627520 master-0 kubenswrapper[17644]: I0319 11:59:36.627462 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlr9q\" (UniqueName: \"kubernetes.io/projected/b3de8a1b-a5be-414f-86e8-738e16c8bc97-kube-api-access-nlr9q\") pod \"marketplace-operator-89ccd998f-bftt4\" (UID: \"b3de8a1b-a5be-414f-86e8-738e16c8bc97\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:59:36.646711 master-0 kubenswrapper[17644]: I0319 11:59:36.646613 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46m89\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-kube-api-access-46m89\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:59:36.665956 master-0 kubenswrapper[17644]: I0319 11:59:36.665897 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxjl\" (UniqueName: \"kubernetes.io/projected/1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c-kube-api-access-2mxjl\") pod \"dns-default-ztgjs\" (UID: \"1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c\") " pod="openshift-dns/dns-default-ztgjs" Mar 19 11:59:36.687285 master-0 kubenswrapper[17644]: I0319 11:59:36.687228 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-nrtp2\" (UID: \"a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-nrtp2" Mar 19 11:59:36.713769 master-0 kubenswrapper[17644]: I0319 11:59:36.713689 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7bq7\" (UniqueName: \"kubernetes.io/projected/6d41245b-33d4-40f8-bbe1-6d2247e2e335-kube-api-access-k7bq7\") pod \"packageserver-bbf67c86c-n58nq\" (UID: \"6d41245b-33d4-40f8-bbe1-6d2247e2e335\") " pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:36.726819 master-0 kubenswrapper[17644]: I0319 11:59:36.726758 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnl28\" (UniqueName: \"kubernetes.io/projected/8fe4839d-cef4-4ec9-b146-2ae9b76d8a76-kube-api-access-dnl28\") pod \"etcd-operator-8544cbcf9c-9w7hc\" (UID: \"8fe4839d-cef4-4ec9-b146-2ae9b76d8a76\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-9w7hc" Mar 19 11:59:36.747738 master-0 kubenswrapper[17644]: I0319 11:59:36.747664 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tc5\" (UniqueName: 
\"kubernetes.io/projected/163d6a3d-0080-4122-bb7a-17f6e63f00f0-kube-api-access-m7tc5\") pod \"ingress-operator-66b84d69b-qrjj4\" (UID: \"163d6a3d-0080-4122-bb7a-17f6e63f00f0\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" Mar 19 11:59:36.765683 master-0 kubenswrapper[17644]: I0319 11:59:36.765630 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wls49\" (UniqueName: \"kubernetes.io/projected/22e10648-af7c-409e-b947-570e7d807e05-kube-api-access-wls49\") pod \"dns-operator-9c5679d8f-965np\" (UID: \"22e10648-af7c-409e-b947-570e7d807e05\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-965np" Mar 19 11:59:36.788113 master-0 kubenswrapper[17644]: I0319 11:59:36.788055 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbjk\" (UniqueName: \"kubernetes.io/projected/f3b6a8b5-bcaa-47f6-a9d5-6186981191d5-kube-api-access-jdbjk\") pod \"migrator-8487694857-jls48\" (UID: \"f3b6a8b5-bcaa-47f6-a9d5-6186981191d5\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-jls48" Mar 19 11:59:36.808787 master-0 kubenswrapper[17644]: I0319 11:59:36.808683 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p55f\" (UniqueName: \"kubernetes.io/projected/bb22a965-9b36-40cd-993d-747a3978be8e-kube-api-access-5p55f\") pod \"cluster-samples-operator-85f7577d78-ssxxd\" (UID: \"bb22a965-9b36-40cd-993d-747a3978be8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-ssxxd" Mar 19 11:59:36.829447 master-0 kubenswrapper[17644]: I0319 11:59:36.829309 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqcvx\" (UniqueName: \"kubernetes.io/projected/bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a-kube-api-access-lqcvx\") pod \"multus-additional-cni-plugins-n8vwk\" (UID: \"bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a\") " pod="openshift-multus/multus-additional-cni-plugins-n8vwk" Mar 19 
11:59:36.848185 master-0 kubenswrapper[17644]: I0319 11:59:36.848138 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzdh\" (UniqueName: \"kubernetes.io/projected/d625c81e-01cc-424a-997d-546a5204a72b-kube-api-access-tgzdh\") pod \"csi-snapshot-controller-64854d9cff-764k4\" (UID: \"d625c81e-01cc-424a-997d-546a5204a72b\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" Mar 19 11:59:36.866056 master-0 kubenswrapper[17644]: I0319 11:59:36.866006 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dbmq\" (UniqueName: \"kubernetes.io/projected/52bdf7cc-f07d-487e-937c-6567f194947e-kube-api-access-8dbmq\") pod \"cluster-storage-operator-7d87854d6-htdhf\" (UID: \"52bdf7cc-f07d-487e-937c-6567f194947e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-htdhf" Mar 19 11:59:36.884944 master-0 kubenswrapper[17644]: E0319 11:59:36.884877 17644 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:36.884944 master-0 kubenswrapper[17644]: E0319 11:59:36.884924 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:36.885212 master-0 kubenswrapper[17644]: E0319 11:59:36.885014 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access podName:1c576a88-6da4-43e9-a373-0df27a029f59 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:37.384971509 +0000 UTC m=+11.154929534 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access") pod "installer-3-master-0" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:36.899112 master-0 kubenswrapper[17644]: E0319 11:59:36.899053 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:36.923121 master-0 kubenswrapper[17644]: E0319 11:59:36.923052 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 19 11:59:36.955847 master-0 kubenswrapper[17644]: E0319 11:59:36.955771 17644 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.551s" Mar 19 11:59:36.955847 master-0 kubenswrapper[17644]: I0319 11:59:36.955854 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.955874 17644 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="b761f07c-53be-436b-b05e-17a554cb94ed" Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.955899 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c"} Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.955977 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.955997 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.956007 17644 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="b761f07c-53be-436b-b05e-17a554cb94ed" Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.956027 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-ng9ss" Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.956041 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.956069 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:36.956142 master-0 kubenswrapper[17644]: I0319 11:59:36.956096 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:36.956465 master-0 kubenswrapper[17644]: I0319 11:59:36.956153 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:36.956465 master-0 kubenswrapper[17644]: I0319 11:59:36.956303 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:59:36.965442 master-0 kubenswrapper[17644]: I0319 11:59:36.965377 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:36.965700 master-0 
kubenswrapper[17644]: I0319 11:59:36.965460 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:36.965700 master-0 kubenswrapper[17644]: I0319 11:59:36.965538 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:36.965700 master-0 kubenswrapper[17644]: I0319 11:59:36.965550 17644 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:36.965700 master-0 kubenswrapper[17644]: I0319 11:59:36.965566 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:36.965700 master-0 kubenswrapper[17644]: I0319 11:59:36.965591 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:36.965700 master-0 kubenswrapper[17644]: I0319 11:59:36.965624 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:59:36.965700 master-0 kubenswrapper[17644]: I0319 11:59:36.965646 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 11:59:36.965700 master-0 kubenswrapper[17644]: I0319 11:59:36.965685 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.965795 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.965871 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.965900 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.965941 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ztgjs" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.965963 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.966022 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ztgjs" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.966047 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.966063 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-l9sw9" Mar 19 11:59:36.966146 master-0 kubenswrapper[17644]: I0319 11:59:36.966083 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:36.980743 master-0 kubenswrapper[17644]: I0319 11:59:36.980688 17644 scope.go:117] "RemoveContainer" containerID="ceffe432bb3380aafe0729954185b3652b99ca21e97ac6c1e688d47217f36148" Mar 19 11:59:36.990808 master-0 kubenswrapper[17644]: I0319 11:59:36.990772 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:36.994686 master-0 
kubenswrapper[17644]: I0319 11:59:36.994641 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-bbf67c86c-n58nq" Mar 19 11:59:36.996274 master-0 kubenswrapper[17644]: I0319 11:59:36.996216 17644 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 19 11:59:36.996471 master-0 kubenswrapper[17644]: I0319 11:59:36.996297 17644 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 19 11:59:37.216225 master-0 kubenswrapper[17644]: I0319 11:59:37.210011 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:59:37.219148 master-0 kubenswrapper[17644]: I0319 11:59:37.219045 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/1.log" Mar 19 11:59:37.219772 master-0 kubenswrapper[17644]: I0319 11:59:37.219663 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-qrjj4" event={"ID":"163d6a3d-0080-4122-bb7a-17f6e63f00f0","Type":"ContainerStarted","Data":"f1a369e2c168d3504555cfb36bf2536f4821c23fd8aaab330dd9177e86bb6838"} Mar 19 11:59:37.220331 master-0 kubenswrapper[17644]: I0319 11:59:37.220302 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:59:37.395461 master-0 kubenswrapper[17644]: I0319 11:59:37.395336 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h668l" Mar 19 11:59:37.429722 master-0 kubenswrapper[17644]: I0319 11:59:37.429670 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: 
\"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:37.430041 master-0 kubenswrapper[17644]: E0319 11:59:37.429934 17644 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:37.430227 master-0 kubenswrapper[17644]: E0319 11:59:37.430211 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:37.430362 master-0 kubenswrapper[17644]: E0319 11:59:37.430349 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access podName:1c576a88-6da4-43e9-a373-0df27a029f59 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:38.430330646 +0000 UTC m=+12.200288681 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access") pod "installer-3-master-0" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:37.435452 master-0 kubenswrapper[17644]: I0319 11:59:37.435398 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h668l" Mar 19 11:59:37.567286 master-0 kubenswrapper[17644]: I0319 11:59:37.567218 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:37.792549 master-0 kubenswrapper[17644]: I0319 11:59:37.792464 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:37.887964 master-0 kubenswrapper[17644]: I0319 11:59:37.887907 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 19 11:59:37.900941 master-0 kubenswrapper[17644]: I0319 11:59:37.900890 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 19 11:59:38.002517 master-0 kubenswrapper[17644]: I0319 11:59:38.002448 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:38.004617 master-0 kubenswrapper[17644]: I0319 11:59:38.004584 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 11:59:38.123173 master-0 kubenswrapper[17644]: I0319 11:59:38.123007 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=16.122989848 
podStartE2EDuration="16.122989848s" podCreationTimestamp="2026-03-19 11:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:59:38.122391403 +0000 UTC m=+11.892349448" watchObservedRunningTime="2026-03-19 11:59:38.122989848 +0000 UTC m=+11.892947883" Mar 19 11:59:38.210686 master-0 kubenswrapper[17644]: I0319 11:59:38.210627 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:38.255118 master-0 kubenswrapper[17644]: I0319 11:59:38.255066 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:38.255470 master-0 kubenswrapper[17644]: I0319 11:59:38.255206 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:59:38.255470 master-0 kubenswrapper[17644]: I0319 11:59:38.255216 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:59:38.309423 master-0 kubenswrapper[17644]: I0319 11:59:38.309376 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:38.360310 master-0 kubenswrapper[17644]: I0319 11:59:38.360261 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:38.388152 master-0 kubenswrapper[17644]: I0319 11:59:38.387979 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5cb4757995-2scrh"] Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: E0319 11:59:38.388273 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: I0319 11:59:38.388287 17644 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: E0319 11:59:38.388300 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870e66ff-82ed-4c91-8197-dddcb78048c2" containerName="installer"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: I0319 11:59:38.388307 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="870e66ff-82ed-4c91-8197-dddcb78048c2" containerName="installer"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: E0319 11:59:38.388315 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: I0319 11:59:38.388323 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: E0319 11:59:38.388336 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: I0319 11:59:38.388345 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: E0319 11:59:38.388353 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bde080b-3820-463f-a27d-9fb9a7843d5d" containerName="installer"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: I0319 11:59:38.388360 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bde080b-3820-463f-a27d-9fb9a7843d5d" containerName="installer"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: E0319 11:59:38.388373 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" containerName="installer"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: I0319 11:59:38.388381 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" containerName="installer"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: E0319 11:59:38.388389 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: I0319 11:59:38.388397 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: E0319 11:59:38.388407 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c576a88-6da4-43e9-a373-0df27a029f59" containerName="installer"
Mar 19 11:59:38.388451 master-0 kubenswrapper[17644]: I0319 11:59:38.388413 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c576a88-6da4-43e9-a373-0df27a029f59" containerName="installer"
Mar 19 11:59:38.388929 master-0 kubenswrapper[17644]: I0319 11:59:38.388502 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bde080b-3820-463f-a27d-9fb9a7843d5d" containerName="installer"
Mar 19 11:59:38.388929 master-0 kubenswrapper[17644]: I0319 11:59:38.388521 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 11:59:38.388929 master-0 kubenswrapper[17644]: I0319 11:59:38.388531 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="870e66ff-82ed-4c91-8197-dddcb78048c2" containerName="installer"
Mar 19 11:59:38.388929 master-0 kubenswrapper[17644]: I0319 11:59:38.388552 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" containerName="installer"
Mar 19 11:59:38.388929 master-0 kubenswrapper[17644]: I0319 11:59:38.388564 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13ffb3e-ab50-411c-9208-7ba47e8ebc92" containerName="assisted-installer-controller"
Mar 19 11:59:38.388929 master-0 kubenswrapper[17644]: I0319 11:59:38.388581 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 19 11:59:38.388929 master-0 kubenswrapper[17644]: I0319 11:59:38.388594 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 11:59:38.388929 master-0 kubenswrapper[17644]: I0319 11:59:38.388602 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c576a88-6da4-43e9-a373-0df27a029f59" containerName="installer"
Mar 19 11:59:38.389155 master-0 kubenswrapper[17644]: I0319 11:59:38.389048 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.406217 master-0 kubenswrapper[17644]: I0319 11:59:38.406170 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5cb4757995-2scrh"]
Mar 19 11:59:38.418970 master-0 kubenswrapper[17644]: I0319 11:59:38.418901 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-sjgm7"
Mar 19 11:59:38.453766 master-0 kubenswrapper[17644]: I0319 11:59:38.448400 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:59:38.453766 master-0 kubenswrapper[17644]: I0319 11:59:38.452142 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 11:59:38.453766 master-0 kubenswrapper[17644]: E0319 11:59:38.452531 17644 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 11:59:38.453766 master-0 kubenswrapper[17644]: E0319 11:59:38.452552 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 11:59:38.453766 master-0 kubenswrapper[17644]: E0319 11:59:38.452608 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access podName:1c576a88-6da4-43e9-a373-0df27a029f59 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:40.452585561 +0000 UTC m=+14.222543596 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access") pod "installer-3-master-0" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 11:59:38.456268 master-0 kubenswrapper[17644]: I0319 11:59:38.456228 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 11:59:38.484257 master-0 kubenswrapper[17644]: I0319 11:59:38.484192 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 11:59:38.522755 master-0 kubenswrapper[17644]: I0319 11:59:38.517945 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 11:59:38.522755 master-0 kubenswrapper[17644]: I0319 11:59:38.518272 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 11:59:38.536991 master-0 kubenswrapper[17644]: I0319 11:59:38.533708 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 11:59:38.552316 master-0 kubenswrapper[17644]: I0319 11:59:38.552247 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552316 master-0 kubenswrapper[17644]: I0319 11:59:38.552309 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552494 master-0 kubenswrapper[17644]: I0319 11:59:38.552367 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552494 master-0 kubenswrapper[17644]: I0319 11:59:38.552398 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-dir\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552494 master-0 kubenswrapper[17644]: I0319 11:59:38.552429 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552494 master-0 kubenswrapper[17644]: I0319 11:59:38.552452 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-error\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552494 master-0 kubenswrapper[17644]: I0319 11:59:38.552468 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-login\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552494 master-0 kubenswrapper[17644]: I0319 11:59:38.552489 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552666 master-0 kubenswrapper[17644]: I0319 11:59:38.552625 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-session\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552789 master-0 kubenswrapper[17644]: I0319 11:59:38.552762 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552830 master-0 kubenswrapper[17644]: I0319 11:59:38.552794 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49qj9\" (UniqueName: \"kubernetes.io/projected/a245a2be-a5d8-4004-99dc-013ae1da116b-kube-api-access-49qj9\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552868 master-0 kubenswrapper[17644]: I0319 11:59:38.552831 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.552868 master-0 kubenswrapper[17644]: I0319 11:59:38.552856 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-policies\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.559873 master-0 kubenswrapper[17644]: I0319 11:59:38.557954 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 11:59:38.574296 master-0 kubenswrapper[17644]: I0319 11:59:38.574244 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 11:59:38.593868 master-0 kubenswrapper[17644]: I0319 11:59:38.593812 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 11:59:38.614508 master-0 kubenswrapper[17644]: I0319 11:59:38.614462 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 11:59:38.640671 master-0 kubenswrapper[17644]: I0319 11:59:38.640542 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 11:59:38.653275 master-0 kubenswrapper[17644]: I0319 11:59:38.653242 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 11:59:38.653903 master-0 kubenswrapper[17644]: I0319 11:59:38.653853 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.653973 master-0 kubenswrapper[17644]: I0319 11:59:38.653913 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-dir\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654020 master-0 kubenswrapper[17644]: I0319 11:59:38.653996 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-dir\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654084 master-0 kubenswrapper[17644]: I0319 11:59:38.654040 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654084 master-0 kubenswrapper[17644]: I0319 11:59:38.654074 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-error\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654154 master-0 kubenswrapper[17644]: I0319 11:59:38.654098 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-login\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654154 master-0 kubenswrapper[17644]: I0319 11:59:38.654121 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654394 master-0 kubenswrapper[17644]: I0319 11:59:38.654349 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-session\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654501 master-0 kubenswrapper[17644]: I0319 11:59:38.654475 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654581 master-0 kubenswrapper[17644]: I0319 11:59:38.654489 17644 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 19 11:59:38.654581 master-0 kubenswrapper[17644]: I0319 11:59:38.654507 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49qj9\" (UniqueName: \"kubernetes.io/projected/a245a2be-a5d8-4004-99dc-013ae1da116b-kube-api-access-49qj9\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654700 master-0 kubenswrapper[17644]: I0319 11:59:38.654586 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654700 master-0 kubenswrapper[17644]: I0319 11:59:38.654645 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-policies\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654700 master-0 kubenswrapper[17644]: I0319 11:59:38.654689 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654832 master-0 kubenswrapper[17644]: I0319 11:59:38.654715 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.654886 master-0 kubenswrapper[17644]: E0319 11:59:38.654855 17644 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found
Mar 19 11:59:38.655062 master-0 kubenswrapper[17644]: E0319 11:59:38.654905 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig podName:a245a2be-a5d8-4004-99dc-013ae1da116b nodeName:}" failed. No retries permitted until 2026-03-19 11:59:39.154885933 +0000 UTC m=+12.924843968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig") pod "oauth-openshift-5cb4757995-2scrh" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b") : configmap "v4-0-config-system-cliconfig" not found
Mar 19 11:59:38.655621 master-0 kubenswrapper[17644]: I0319 11:59:38.655586 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.655989 master-0 kubenswrapper[17644]: I0319 11:59:38.655962 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-policies\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.656222 master-0 kubenswrapper[17644]: I0319 11:59:38.656150 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.658348 master-0 kubenswrapper[17644]: I0319 11:59:38.658284 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.658960 master-0 kubenswrapper[17644]: I0319 11:59:38.658582 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-error\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.658960 master-0 kubenswrapper[17644]: I0319 11:59:38.658805 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.659659 master-0 kubenswrapper[17644]: I0319 11:59:38.659627 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.662049 master-0 kubenswrapper[17644]: I0319 11:59:38.662025 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-login\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.664982 master-0 kubenswrapper[17644]: I0319 11:59:38.664926 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-session\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.666115 master-0 kubenswrapper[17644]: I0319 11:59:38.666076 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.674459 master-0 kubenswrapper[17644]: I0319 11:59:38.674397 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 11:59:38.732411 master-0 kubenswrapper[17644]: I0319 11:59:38.732329 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49qj9\" (UniqueName: \"kubernetes.io/projected/a245a2be-a5d8-4004-99dc-013ae1da116b-kube-api-access-49qj9\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:38.749001 master-0 kubenswrapper[17644]: I0319 11:59:38.748908 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=6.748887692 podStartE2EDuration="6.748887692s" podCreationTimestamp="2026-03-19 11:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:59:38.41929367 +0000 UTC m=+12.189251725" watchObservedRunningTime="2026-03-19 11:59:38.748887692 +0000 UTC m=+12.518845727"
Mar 19 11:59:38.878309 master-0 kubenswrapper[17644]: I0319 11:59:38.878242 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ccbc5"
Mar 19 11:59:39.068893 master-0 kubenswrapper[17644]: I0319 11:59:39.068252 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b8hzk"]
Mar 19 11:59:39.069315 master-0 kubenswrapper[17644]: I0319 11:59:39.069291 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 11:59:39.072464 master-0 kubenswrapper[17644]: I0319 11:59:39.072410 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:39.078763 master-0 kubenswrapper[17644]: I0319 11:59:39.077557 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 11:59:39.078763 master-0 kubenswrapper[17644]: I0319 11:59:39.077641 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:59:39.084518 master-0 kubenswrapper[17644]: I0319 11:59:39.084471 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b8hzk"]
Mar 19 11:59:39.094204 master-0 kubenswrapper[17644]: I0319 11:59:39.094133 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 11:59:39.113889 master-0 kubenswrapper[17644]: I0319 11:59:39.113837 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 11:59:39.162051 master-0 kubenswrapper[17644]: I0319 11:59:39.161972 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 11:59:39.162300 master-0 kubenswrapper[17644]: I0319 11:59:39.162074 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xsg9\" (UniqueName: \"kubernetes.io/projected/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-kube-api-access-7xsg9\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 11:59:39.162300 master-0 kubenswrapper[17644]: I0319 11:59:39.162112 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:39.162976 master-0 kubenswrapper[17644]: I0319 11:59:39.162947 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5cb4757995-2scrh\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") " pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:39.235222 master-0 kubenswrapper[17644]: I0319 11:59:39.235170 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 11:59:39.264012 master-0 kubenswrapper[17644]: I0319 11:59:39.263939 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 11:59:39.264263 master-0 kubenswrapper[17644]: I0319 11:59:39.264065 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xsg9\" (UniqueName: \"kubernetes.io/projected/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-kube-api-access-7xsg9\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 11:59:39.264263 master-0 kubenswrapper[17644]: E0319 11:59:39.264084 17644 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 19 11:59:39.264263 master-0 kubenswrapper[17644]: E0319 11:59:39.264155 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert podName:7f06b4ae-bfd4-465d-b2e2-465cc186cb4b nodeName:}" failed. No retries permitted until 2026-03-19 11:59:39.764134616 +0000 UTC m=+13.534092651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert") pod "ingress-canary-b8hzk" (UID: "7f06b4ae-bfd4-465d-b2e2-465cc186cb4b") : secret "canary-serving-cert" not found
Mar 19 11:59:39.304232 master-0 kubenswrapper[17644]: I0319 11:59:39.304146 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 11:59:39.309579 master-0 kubenswrapper[17644]: I0319 11:59:39.309528 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xsg9\" (UniqueName: \"kubernetes.io/projected/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-kube-api-access-7xsg9\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 11:59:39.538641 master-0 kubenswrapper[17644]: I0319 11:59:39.538571 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:59:39.579057 master-0 kubenswrapper[17644]: I0319 11:59:39.578917 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwt6h"
Mar 19 11:59:39.714539 master-0 kubenswrapper[17644]: I0319 11:59:39.714465 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5cb4757995-2scrh"]
Mar 19 11:59:39.727750 master-0 kubenswrapper[17644]: I0319 11:59:39.726365 17644 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 11:59:39.765487 master-0 kubenswrapper[17644]: I0319 11:59:39.765429 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj"
Mar 19 11:59:39.773936 master-0 kubenswrapper[17644]: I0319 11:59:39.773872 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj"
Mar 19 11:59:39.774754 master-0 kubenswrapper[17644]: I0319 11:59:39.774684 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 11:59:39.776577 master-0 kubenswrapper[17644]: E0319 11:59:39.774989 17644 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 19 11:59:39.776577 master-0 kubenswrapper[17644]: E0319 11:59:39.775093 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert podName:7f06b4ae-bfd4-465d-b2e2-465cc186cb4b nodeName:}" failed. No retries permitted until 2026-03-19 11:59:40.775075314 +0000 UTC m=+14.545033349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert") pod "ingress-canary-b8hzk" (UID: "7f06b4ae-bfd4-465d-b2e2-465cc186cb4b") : secret "canary-serving-cert" not found
Mar 19 11:59:40.244519 master-0 kubenswrapper[17644]: I0319 11:59:40.244461 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 11:59:40.244885 master-0 kubenswrapper[17644]: I0319 11:59:40.244611 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 11:59:40.245711 master-0 kubenswrapper[17644]: I0319 11:59:40.245662 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" event={"ID":"a245a2be-a5d8-4004-99dc-013ae1da116b","Type":"ContainerStarted","Data":"0ff2056a73249521f24ca1126dca54b921ba317a2b76cd12ea17abe274bdf2c2"}
Mar 19 11:59:40.255044 master-0 kubenswrapper[17644]: I0319 11:59:40.254965 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt"
Mar 19 11:59:40.261865 master-0 kubenswrapper[17644]: I0319 11:59:40.261519 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-89rdt"
Mar 19 11:59:40.408189 master-0 kubenswrapper[17644]: I0319 11:59:40.408119 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"
Mar 19 11:59:40.414335 master-0 kubenswrapper[17644]: I0319 11:59:40.414284 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"
Mar 19 11:59:40.496703 master-0 kubenswrapper[17644]: I0319 11:59:40.496560 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 11:59:40.497016 master-0 kubenswrapper[17644]: E0319 11:59:40.496806 17644 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 11:59:40.497016 master-0 kubenswrapper[17644]: E0319 11:59:40.496830 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 11:59:40.497016 master-0 kubenswrapper[17644]: E0319 11:59:40.496886 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access podName:1c576a88-6da4-43e9-a373-0df27a029f59 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:44.496866315 +0000 UTC m=+18.266824350 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access") pod "installer-3-master-0" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:40.803752 master-0 kubenswrapper[17644]: I0319 11:59:40.803571 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk" Mar 19 11:59:40.804336 master-0 kubenswrapper[17644]: E0319 11:59:40.803861 17644 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 19 11:59:40.804336 master-0 kubenswrapper[17644]: E0319 11:59:40.803982 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert podName:7f06b4ae-bfd4-465d-b2e2-465cc186cb4b nodeName:}" failed. No retries permitted until 2026-03-19 11:59:42.803956143 +0000 UTC m=+16.573914178 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert") pod "ingress-canary-b8hzk" (UID: "7f06b4ae-bfd4-465d-b2e2-465cc186cb4b") : secret "canary-serving-cert" not found Mar 19 11:59:41.103270 master-0 kubenswrapper[17644]: I0319 11:59:41.103144 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:41.108650 master-0 kubenswrapper[17644]: I0319 11:59:41.108558 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:59:41.395706 master-0 kubenswrapper[17644]: I0319 11:59:41.395100 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-899bc59d8-xxr9r" Mar 19 11:59:41.532630 master-0 kubenswrapper[17644]: I0319 11:59:41.532574 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:41.621798 master-0 kubenswrapper[17644]: I0319 11:59:41.621744 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:59:41.625537 master-0 kubenswrapper[17644]: I0319 11:59:41.625323 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-n5gr9" Mar 19 11:59:41.694813 master-0 kubenswrapper[17644]: I0319 11:59:41.694757 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-f67f6868b-chx8j" Mar 19 11:59:41.999027 master-0 kubenswrapper[17644]: I0319 11:59:41.998880 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:59:42.053066 master-0 kubenswrapper[17644]: 
I0319 11:59:42.052827 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwt6h" Mar 19 11:59:42.886333 master-0 kubenswrapper[17644]: I0319 11:59:42.886247 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk" Mar 19 11:59:42.886583 master-0 kubenswrapper[17644]: E0319 11:59:42.886486 17644 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 19 11:59:42.886627 master-0 kubenswrapper[17644]: E0319 11:59:42.886600 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert podName:7f06b4ae-bfd4-465d-b2e2-465cc186cb4b nodeName:}" failed. No retries permitted until 2026-03-19 11:59:46.886572622 +0000 UTC m=+20.656530807 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert") pod "ingress-canary-b8hzk" (UID: "7f06b4ae-bfd4-465d-b2e2-465cc186cb4b") : secret "canary-serving-cert" not found Mar 19 11:59:42.892913 master-0 kubenswrapper[17644]: I0319 11:59:42.892859 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:59:42.895503 master-0 kubenswrapper[17644]: I0319 11:59:42.895471 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cr8n7" Mar 19 11:59:42.933608 master-0 kubenswrapper[17644]: I0319 11:59:42.933536 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:42.933889 master-0 kubenswrapper[17644]: I0319 11:59:42.933702 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:59:42.937447 master-0 kubenswrapper[17644]: I0319 11:59:42.936854 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7dcf5569b5-kpmgt" Mar 19 11:59:43.268485 master-0 kubenswrapper[17644]: I0319 11:59:43.268421 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" event={"ID":"a245a2be-a5d8-4004-99dc-013ae1da116b","Type":"ContainerStarted","Data":"87329a5c60581def1191476a28ca9a42a69fa05d9396af87d913af5505fd2346"} Mar 19 11:59:43.300831 master-0 kubenswrapper[17644]: I0319 11:59:43.300719 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" podStartSLOduration=2.47071976 podStartE2EDuration="5.300691491s" podCreationTimestamp="2026-03-19 11:59:38 +0000 UTC" firstStartedPulling="2026-03-19 11:59:39.725610214 +0000 UTC m=+13.495568249" 
lastFinishedPulling="2026-03-19 11:59:42.555581955 +0000 UTC m=+16.325539980" observedRunningTime="2026-03-19 11:59:43.299622355 +0000 UTC m=+17.069580410" watchObservedRunningTime="2026-03-19 11:59:43.300691491 +0000 UTC m=+17.070649526" Mar 19 11:59:43.551542 master-0 kubenswrapper[17644]: I0319 11:59:43.549818 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h668l" Mar 19 11:59:43.615387 master-0 kubenswrapper[17644]: I0319 11:59:43.615331 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h668l" Mar 19 11:59:44.274906 master-0 kubenswrapper[17644]: I0319 11:59:44.274807 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" Mar 19 11:59:44.283204 master-0 kubenswrapper[17644]: I0319 11:59:44.283126 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" Mar 19 11:59:44.514197 master-0 kubenswrapper[17644]: I0319 11:59:44.514132 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:44.514439 master-0 kubenswrapper[17644]: E0319 11:59:44.514366 17644 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:44.514439 master-0 kubenswrapper[17644]: E0319 11:59:44.514407 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 
19 11:59:44.514527 master-0 kubenswrapper[17644]: E0319 11:59:44.514482 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access podName:1c576a88-6da4-43e9-a373-0df27a029f59 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:52.514456762 +0000 UTC m=+26.284414797 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access") pod "installer-3-master-0" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:44.845808 master-0 kubenswrapper[17644]: I0319 11:59:44.845708 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:59:44.850103 master-0 kubenswrapper[17644]: I0319 11:59:44.850003 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-jq5vq" Mar 19 11:59:46.742773 master-0 kubenswrapper[17644]: I0319 11:59:46.741488 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w2fqh" Mar 19 11:59:46.974195 master-0 kubenswrapper[17644]: I0319 11:59:46.974098 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk" Mar 19 11:59:46.974496 master-0 kubenswrapper[17644]: E0319 11:59:46.974342 17644 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 19 11:59:46.974496 master-0 kubenswrapper[17644]: E0319 
11:59:46.974462 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert podName:7f06b4ae-bfd4-465d-b2e2-465cc186cb4b nodeName:}" failed. No retries permitted until 2026-03-19 11:59:54.974431604 +0000 UTC m=+28.744389629 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert") pod "ingress-canary-b8hzk" (UID: "7f06b4ae-bfd4-465d-b2e2-465cc186cb4b") : secret "canary-serving-cert" not found Mar 19 11:59:47.268487 master-0 kubenswrapper[17644]: I0319 11:59:47.268445 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:59:47.309319 master-0 kubenswrapper[17644]: I0319 11:59:47.307650 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ccbc5" Mar 19 11:59:47.571127 master-0 kubenswrapper[17644]: I0319 11:59:47.571004 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 11:59:49.347860 master-0 kubenswrapper[17644]: I0319 11:59:49.347790 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:49.348511 master-0 kubenswrapper[17644]: I0319 11:59:49.348017 17644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:59:49.369554 master-0 kubenswrapper[17644]: I0319 11:59:49.369495 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qxkd" Mar 19 11:59:51.669354 master-0 kubenswrapper[17644]: I0319 11:59:51.669233 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 11:59:52.566143 master-0 kubenswrapper[17644]: I0319 
11:59:52.565436 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 11:59:52.566143 master-0 kubenswrapper[17644]: E0319 11:59:52.565678 17644 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:52.566143 master-0 kubenswrapper[17644]: E0319 11:59:52.565749 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:52.566143 master-0 kubenswrapper[17644]: E0319 11:59:52.565817 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access podName:1c576a88-6da4-43e9-a373-0df27a029f59 nodeName:}" failed. No retries permitted until 2026-03-19 12:00:08.565794816 +0000 UTC m=+42.335752841 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access") pod "installer-3-master-0" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 19 11:59:54.113528 master-0 kubenswrapper[17644]: I0319 11:59:54.113141 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 11:59:54.113528 master-0 kubenswrapper[17644]: I0319 11:59:54.113420 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" containerID="cri-o://9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a" gracePeriod=5 Mar 19 11:59:55.017870 master-0 kubenswrapper[17644]: I0319 11:59:55.017786 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk" Mar 19 11:59:55.018143 master-0 kubenswrapper[17644]: E0319 11:59:55.018050 17644 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 19 11:59:55.018224 master-0 kubenswrapper[17644]: E0319 11:59:55.018191 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert podName:7f06b4ae-bfd4-465d-b2e2-465cc186cb4b nodeName:}" failed. No retries permitted until 2026-03-19 12:00:11.018157769 +0000 UTC m=+44.788115804 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert") pod "ingress-canary-b8hzk" (UID: "7f06b4ae-bfd4-465d-b2e2-465cc186cb4b") : secret "canary-serving-cert" not found Mar 19 11:59:56.120328 master-0 kubenswrapper[17644]: I0319 11:59:56.120266 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-8bvjj"] Mar 19 11:59:56.120901 master-0 kubenswrapper[17644]: E0319 11:59:56.120669 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 11:59:56.120901 master-0 kubenswrapper[17644]: I0319 11:59:56.120688 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 11:59:56.121028 master-0 kubenswrapper[17644]: I0319 11:59:56.120912 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 11:59:56.121577 master-0 kubenswrapper[17644]: I0319 11:59:56.121548 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.124055 master-0 kubenswrapper[17644]: I0319 11:59:56.124007 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 11:59:56.124283 master-0 kubenswrapper[17644]: I0319 11:59:56.124254 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 11:59:56.124520 master-0 kubenswrapper[17644]: I0319 11:59:56.124480 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 11:59:56.124750 master-0 kubenswrapper[17644]: I0319 11:59:56.124715 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 11:59:56.124971 master-0 kubenswrapper[17644]: I0319 11:59:56.124944 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 11:59:56.139697 master-0 kubenswrapper[17644]: I0319 11:59:56.139641 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-8bvjj"] Mar 19 11:59:56.235221 master-0 kubenswrapper[17644]: I0319 11:59:56.235158 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.235221 master-0 kubenswrapper[17644]: I0319 11:59:56.235231 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-config\") pod 
\"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.235546 master-0 kubenswrapper[17644]: I0319 11:59:56.235290 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvq7\" (UniqueName: \"kubernetes.io/projected/d2fd7597-cd7a-4138-bb3c-01681c569bd3-kube-api-access-clvq7\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.235546 master-0 kubenswrapper[17644]: I0319 11:59:56.235347 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fd7597-cd7a-4138-bb3c-01681c569bd3-serving-cert\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.336843 master-0 kubenswrapper[17644]: I0319 11:59:56.336757 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fd7597-cd7a-4138-bb3c-01681c569bd3-serving-cert\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.336843 master-0 kubenswrapper[17644]: I0319 11:59:56.336821 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.336843 master-0 kubenswrapper[17644]: I0319 11:59:56.336850 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-config\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.337192 master-0 kubenswrapper[17644]: I0319 11:59:56.337116 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvq7\" (UniqueName: \"kubernetes.io/projected/d2fd7597-cd7a-4138-bb3c-01681c569bd3-kube-api-access-clvq7\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.337710 master-0 kubenswrapper[17644]: E0319 11:59:56.337644 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:56.837594187 +0000 UTC m=+30.607552402 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt Mar 19 11:59:56.338535 master-0 kubenswrapper[17644]: I0319 11:59:56.338502 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-config\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.340506 master-0 kubenswrapper[17644]: I0319 11:59:56.340464 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fd7597-cd7a-4138-bb3c-01681c569bd3-serving-cert\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.484102 master-0 kubenswrapper[17644]: I0319 11:59:56.484029 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvq7\" (UniqueName: \"kubernetes.io/projected/d2fd7597-cd7a-4138-bb3c-01681c569bd3-kube-api-access-clvq7\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.844668 master-0 kubenswrapper[17644]: I0319 11:59:56.844519 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:56.845136 
master-0 kubenswrapper[17644]: E0319 11:59:56.845104 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:57.84507598 +0000 UTC m=+31.615034065 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt Mar 19 11:59:57.863138 master-0 kubenswrapper[17644]: I0319 11:59:57.863035 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 11:59:57.863824 master-0 kubenswrapper[17644]: E0319 11:59:57.863266 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:59.863245664 +0000 UTC m=+33.633203699 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt
Mar 19 11:59:59.256910 master-0 kubenswrapper[17644]: I0319 11:59:59.256857 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log"
Mar 19 11:59:59.257549 master-0 kubenswrapper[17644]: I0319 11:59:59.256945 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 11:59:59.382225 master-0 kubenswrapper[17644]: I0319 11:59:59.382031 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 11:59:59.382225 master-0 kubenswrapper[17644]: I0319 11:59:59.382116 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 11:59:59.382225 master-0 kubenswrapper[17644]: I0319 11:59:59.382145 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 11:59:59.382225 master-0 kubenswrapper[17644]: I0319 11:59:59.382185 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests" (OuterVolumeSpecName: "manifests") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:59:59.382225 master-0 kubenswrapper[17644]: I0319 11:59:59.382214 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 11:59:59.382225 master-0 kubenswrapper[17644]: I0319 11:59:59.382237 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 11:59:59.382225 master-0 kubenswrapper[17644]: I0319 11:59:59.382240 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock" (OuterVolumeSpecName: "var-lock") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:59:59.382671 master-0 kubenswrapper[17644]: I0319 11:59:59.382275 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log" (OuterVolumeSpecName: "var-log") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:59:59.382671 master-0 kubenswrapper[17644]: I0319 11:59:59.382346 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:59:59.382753 master-0 kubenswrapper[17644]: I0319 11:59:59.382669 17644 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") on node \"master-0\" DevicePath \"\""
Mar 19 11:59:59.382753 master-0 kubenswrapper[17644]: I0319 11:59:59.382688 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 11:59:59.382753 master-0 kubenswrapper[17644]: I0319 11:59:59.382702 17644 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") on node \"master-0\" DevicePath \"\""
Mar 19 11:59:59.382753 master-0 kubenswrapper[17644]: I0319 11:59:59.382713 17644 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:59:59.388298 master-0 kubenswrapper[17644]: I0319 11:59:59.388224 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:59:59.483625 master-0 kubenswrapper[17644]: I0319 11:59:59.483555 17644 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:59:59.631052 master-0 kubenswrapper[17644]: I0319 11:59:59.631002 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log"
Mar 19 11:59:59.631324 master-0 kubenswrapper[17644]: I0319 11:59:59.631065 17644 generic.go:334] "Generic (PLEG): container finished" podID="8e7a82869988463543d3d8dd1f0b5fe3" containerID="9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a" exitCode=137
Mar 19 11:59:59.631324 master-0 kubenswrapper[17644]: I0319 11:59:59.631131 17644 scope.go:117] "RemoveContainer" containerID="9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a"
Mar 19 11:59:59.631324 master-0 kubenswrapper[17644]: I0319 11:59:59.631140 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 11:59:59.647677 master-0 kubenswrapper[17644]: I0319 11:59:59.647625 17644 scope.go:117] "RemoveContainer" containerID="9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a"
Mar 19 11:59:59.648194 master-0 kubenswrapper[17644]: E0319 11:59:59.648144 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a\": container with ID starting with 9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a not found: ID does not exist" containerID="9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a"
Mar 19 11:59:59.648269 master-0 kubenswrapper[17644]: I0319 11:59:59.648196 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a"} err="failed to get container status \"9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a\": rpc error: code = NotFound desc = could not find container \"9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a\": container with ID starting with 9b6257f6f7778f2a35b2ae1acf1c50824333ce2495da482e1b0a8f990b61871a not found: ID does not exist"
Mar 19 11:59:59.730025 master-0 kubenswrapper[17644]: I0319 11:59:59.729957 17644 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="eae400da-9f9e-4b49-9590-8a69b1bc08cd"
Mar 19 11:59:59.890821 master-0 kubenswrapper[17644]: I0319 11:59:59.890580 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj"
Mar 19 11:59:59.890821 master-0 kubenswrapper[17644]: E0319 11:59:59.890822 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. No retries permitted until 2026-03-19 12:00:03.890793095 +0000 UTC m=+37.660751130 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt
Mar 19 12:00:00.241293 master-0 kubenswrapper[17644]: I0319 12:00:00.241233 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-8688f6945-trnd5"]
Mar 19 12:00:00.242610 master-0 kubenswrapper[17644]: I0319 12:00:00.242576 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5"
Mar 19 12:00:00.246020 master-0 kubenswrapper[17644]: I0319 12:00:00.245941 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 19 12:00:00.246989 master-0 kubenswrapper[17644]: I0319 12:00:00.246944 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-zmgxk"
Mar 19 12:00:00.263307 master-0 kubenswrapper[17644]: I0319 12:00:00.262715 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-8688f6945-trnd5"]
Mar 19 12:00:00.398943 master-0 kubenswrapper[17644]: I0319 12:00:00.398491 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f747c54-7f5b-4ec9-a16d-7cb13e511f98-monitoring-plugin-cert\") pod \"monitoring-plugin-8688f6945-trnd5\" (UID: \"5f747c54-7f5b-4ec9-a16d-7cb13e511f98\") " pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5"
Mar 19 12:00:00.493748 master-0 kubenswrapper[17644]: I0319 12:00:00.493576 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7a82869988463543d3d8dd1f0b5fe3" path="/var/lib/kubelet/pods/8e7a82869988463543d3d8dd1f0b5fe3/volumes"
Mar 19 12:00:00.493994 master-0 kubenswrapper[17644]: I0319 12:00:00.493865 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Mar 19 12:00:00.500141 master-0 kubenswrapper[17644]: I0319 12:00:00.500096 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f747c54-7f5b-4ec9-a16d-7cb13e511f98-monitoring-plugin-cert\") pod \"monitoring-plugin-8688f6945-trnd5\" (UID: \"5f747c54-7f5b-4ec9-a16d-7cb13e511f98\") " pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5"
Mar 19 12:00:00.503745 master-0 kubenswrapper[17644]: I0319 12:00:00.503554 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f747c54-7f5b-4ec9-a16d-7cb13e511f98-monitoring-plugin-cert\") pod \"monitoring-plugin-8688f6945-trnd5\" (UID: \"5f747c54-7f5b-4ec9-a16d-7cb13e511f98\") " pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5"
Mar 19 12:00:00.509902 master-0 kubenswrapper[17644]: I0319 12:00:00.509848 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:00:00.510017 master-0 kubenswrapper[17644]: I0319 12:00:00.509897 17644 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="eae400da-9f9e-4b49-9590-8a69b1bc08cd"
Mar 19 12:00:00.512256 master-0 kubenswrapper[17644]: I0319 12:00:00.512218 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:00:00.512256 master-0 kubenswrapper[17644]: I0319 12:00:00.512248 17644 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="eae400da-9f9e-4b49-9590-8a69b1bc08cd"
Mar 19 12:00:00.571294 master-0 kubenswrapper[17644]: I0319 12:00:00.571231 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5"
Mar 19 12:00:00.865933 master-0 kubenswrapper[17644]: I0319 12:00:00.863835 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5cb4757995-2scrh"]
Mar 19 12:00:00.991552 master-0 kubenswrapper[17644]: I0319 12:00:00.988521 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-8688f6945-trnd5"]
Mar 19 12:00:01.538146 master-0 kubenswrapper[17644]: I0319 12:00:01.538088 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"
Mar 19 12:00:01.542650 master-0 kubenswrapper[17644]: I0319 12:00:01.542586 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"
Mar 19 12:00:01.647549 master-0 kubenswrapper[17644]: I0319 12:00:01.647475 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5" event={"ID":"5f747c54-7f5b-4ec9-a16d-7cb13e511f98","Type":"ContainerStarted","Data":"e9cbbce21e78d2f3fa5dc18e35f678f40a16f326436d7fc06c81b7d9ddbe9ddb"}
Mar 19 12:00:03.957159 master-0 kubenswrapper[17644]: I0319 12:00:03.957094 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj"
Mar 19 12:00:03.957812 master-0 kubenswrapper[17644]: E0319 12:00:03.957292 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. No retries permitted until 2026-03-19 12:00:11.957272839 +0000 UTC m=+45.727230874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt
Mar 19 12:00:04.669639 master-0 kubenswrapper[17644]: I0319 12:00:04.669558 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5" event={"ID":"5f747c54-7f5b-4ec9-a16d-7cb13e511f98","Type":"ContainerStarted","Data":"1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90"}
Mar 19 12:00:04.670038 master-0 kubenswrapper[17644]: I0319 12:00:04.669997 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5"
Mar 19 12:00:04.676014 master-0 kubenswrapper[17644]: I0319 12:00:04.675961 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5"
Mar 19 12:00:04.713818 master-0 kubenswrapper[17644]: I0319 12:00:04.713722 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5" podStartSLOduration=2.163360523 podStartE2EDuration="4.713700485s" podCreationTimestamp="2026-03-19 12:00:00 +0000 UTC" firstStartedPulling="2026-03-19 12:00:01.007193453 +0000 UTC m=+34.777151478" lastFinishedPulling="2026-03-19 12:00:03.557533385 +0000 UTC m=+37.327491440" observedRunningTime="2026-03-19 12:00:04.693485595 +0000 UTC m=+38.463443630" watchObservedRunningTime="2026-03-19 12:00:04.713700485 +0000 UTC m=+38.483658520"
Mar 19 12:00:08.628316 master-0 kubenswrapper[17644]: I0319 12:00:08.628212 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 12:00:08.629198 master-0 kubenswrapper[17644]: E0319 12:00:08.628528 17644 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 12:00:08.629198 master-0 kubenswrapper[17644]: E0319 12:00:08.628589 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 12:00:08.629198 master-0 kubenswrapper[17644]: E0319 12:00:08.628665 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access podName:1c576a88-6da4-43e9-a373-0df27a029f59 nodeName:}" failed. No retries permitted until 2026-03-19 12:00:40.628637849 +0000 UTC m=+74.398595884 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access") pod "installer-3-master-0" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 12:00:11.310513 master-0 kubenswrapper[17644]: I0319 12:00:11.310420 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 12:00:12.210666 master-0 kubenswrapper[17644]: I0319 12:00:12.210612 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj"
Mar 19 12:00:12.212184 master-0 kubenswrapper[17644]: E0319 12:00:12.211389 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. No retries permitted until 2026-03-19 12:00:28.211363821 +0000 UTC m=+61.981321856 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt
Mar 19 12:00:12.226217 master-0 kubenswrapper[17644]: I0319 12:00:12.226135 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f06b4ae-bfd4-465d-b2e2-465cc186cb4b-cert\") pod \"ingress-canary-b8hzk\" (UID: \"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b\") " pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 12:00:12.394707 master-0 kubenswrapper[17644]: I0319 12:00:12.394637 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8hzk"
Mar 19 12:00:12.818244 master-0 kubenswrapper[17644]: I0319 12:00:12.818115 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b8hzk"]
Mar 19 12:00:12.830824 master-0 kubenswrapper[17644]: W0319 12:00:12.830789 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f06b4ae_bfd4_465d_b2e2_465cc186cb4b.slice/crio-f9cb09d08ac9c67e5a65851c11a4beb82308ea13fa086c47e6c1f2d2824ade3e WatchSource:0}: Error finding container f9cb09d08ac9c67e5a65851c11a4beb82308ea13fa086c47e6c1f2d2824ade3e: Status 404 returned error can't find the container with id f9cb09d08ac9c67e5a65851c11a4beb82308ea13fa086c47e6c1f2d2824ade3e
Mar 19 12:00:13.243168 master-0 kubenswrapper[17644]: I0319 12:00:13.243087 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b8hzk" event={"ID":"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b","Type":"ContainerStarted","Data":"15e089d69e14632abcf1cb037b728960543325aa97ee6d98b5ea9220d0be8fed"}
Mar 19 12:00:13.243168 master-0 kubenswrapper[17644]: I0319 12:00:13.243164 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b8hzk" event={"ID":"7f06b4ae-bfd4-465d-b2e2-465cc186cb4b","Type":"ContainerStarted","Data":"f9cb09d08ac9c67e5a65851c11a4beb82308ea13fa086c47e6c1f2d2824ade3e"}
Mar 19 12:00:13.260082 master-0 kubenswrapper[17644]: I0319 12:00:13.259997 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b8hzk" podStartSLOduration=34.259976115 podStartE2EDuration="34.259976115s" podCreationTimestamp="2026-03-19 11:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:00:13.25733943 +0000 UTC m=+47.027297475" watchObservedRunningTime="2026-03-19 12:00:13.259976115 +0000 UTC m=+47.029934150"
Mar 19 12:00:25.897707 master-0 kubenswrapper[17644]: I0319 12:00:25.897593 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" podUID="a245a2be-a5d8-4004-99dc-013ae1da116b" containerName="oauth-openshift" containerID="cri-o://87329a5c60581def1191476a28ca9a42a69fa05d9396af87d913af5505fd2346" gracePeriod=15
Mar 19 12:00:26.332533 master-0 kubenswrapper[17644]: I0319 12:00:26.332476 17644 generic.go:334] "Generic (PLEG): container finished" podID="a245a2be-a5d8-4004-99dc-013ae1da116b" containerID="87329a5c60581def1191476a28ca9a42a69fa05d9396af87d913af5505fd2346" exitCode=0
Mar 19 12:00:26.332533 master-0 kubenswrapper[17644]: I0319 12:00:26.332530 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" event={"ID":"a245a2be-a5d8-4004-99dc-013ae1da116b","Type":"ContainerDied","Data":"87329a5c60581def1191476a28ca9a42a69fa05d9396af87d913af5505fd2346"}
Mar 19 12:00:26.443577 master-0 kubenswrapper[17644]: I0319 12:00:26.443497 17644 scope.go:117] "RemoveContainer" containerID="21c17e15f1723f8eb75ec60f42ebd73c793697e640249886764928c881dbaaa1"
Mar 19 12:00:26.475693 master-0 kubenswrapper[17644]: I0319 12:00:26.475638 17644 scope.go:117] "RemoveContainer" containerID="d3ab6ca62e19d2ef407ccf237743444ad88802357f607cafd2e5c6b8ac29d477"
Mar 19 12:00:26.482377 master-0 kubenswrapper[17644]: I0319 12:00:26.477844 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh"
Mar 19 12:00:26.521158 master-0 kubenswrapper[17644]: I0319 12:00:26.521090 17644 scope.go:117] "RemoveContainer" containerID="39c756c5e9204811d8c83cfa45ff7447029413f92b87a61b82da1dc41e1a076d"
Mar 19 12:00:26.539538 master-0 kubenswrapper[17644]: I0319 12:00:26.537246 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"]
Mar 19 12:00:26.539538 master-0 kubenswrapper[17644]: E0319 12:00:26.537661 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a245a2be-a5d8-4004-99dc-013ae1da116b" containerName="oauth-openshift"
Mar 19 12:00:26.539538 master-0 kubenswrapper[17644]: I0319 12:00:26.537678 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="a245a2be-a5d8-4004-99dc-013ae1da116b" containerName="oauth-openshift"
Mar 19 12:00:26.539538 master-0 kubenswrapper[17644]: I0319 12:00:26.538529 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="a245a2be-a5d8-4004-99dc-013ae1da116b" containerName="oauth-openshift"
Mar 19 12:00:26.540224 master-0 kubenswrapper[17644]: I0319 12:00:26.539921 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.558498 master-0 kubenswrapper[17644]: I0319 12:00:26.558444 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"]
Mar 19 12:00:26.624345 master-0 kubenswrapper[17644]: I0319 12:00:26.624288 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-service-ca\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624459 master-0 kubenswrapper[17644]: I0319 12:00:26.624396 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49qj9\" (UniqueName: \"kubernetes.io/projected/a245a2be-a5d8-4004-99dc-013ae1da116b-kube-api-access-49qj9\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624459 master-0 kubenswrapper[17644]: I0319 12:00:26.624435 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624561 master-0 kubenswrapper[17644]: I0319 12:00:26.624470 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-serving-cert\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624603 master-0 kubenswrapper[17644]: I0319 12:00:26.624564 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-error\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624635 master-0 kubenswrapper[17644]: I0319 12:00:26.624615 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-dir\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624695 master-0 kubenswrapper[17644]: I0319 12:00:26.624670 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-trusted-ca-bundle\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624758 master-0 kubenswrapper[17644]: I0319 12:00:26.624699 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-router-certs\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624758 master-0 kubenswrapper[17644]: I0319 12:00:26.624718 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-policies\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624758 master-0 kubenswrapper[17644]: I0319 12:00:26.624751 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-login\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624873 master-0 kubenswrapper[17644]: I0319 12:00:26.624776 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-session\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624873 master-0 kubenswrapper[17644]: I0319 12:00:26.624803 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-ocp-branding-template\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.624873 master-0 kubenswrapper[17644]: I0319 12:00:26.624827 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-provider-selection\") pod \"a245a2be-a5d8-4004-99dc-013ae1da116b\" (UID: \"a245a2be-a5d8-4004-99dc-013ae1da116b\") "
Mar 19 12:00:26.627749 master-0 kubenswrapper[17644]: I0319 12:00:26.627698 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:00:26.628181 master-0 kubenswrapper[17644]: I0319 12:00:26.628141 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:00:26.628618 master-0 kubenswrapper[17644]: I0319 12:00:26.628551 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:00:26.629034 master-0 kubenswrapper[17644]: I0319 12:00:26.629001 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:00:26.630139 master-0 kubenswrapper[17644]: I0319 12:00:26.630088 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:00:26.631420 master-0 kubenswrapper[17644]: I0319 12:00:26.631359 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:00:26.632144 master-0 kubenswrapper[17644]: I0319 12:00:26.632115 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a245a2be-a5d8-4004-99dc-013ae1da116b-kube-api-access-49qj9" (OuterVolumeSpecName: "kube-api-access-49qj9") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "kube-api-access-49qj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:00:26.632689 master-0 kubenswrapper[17644]: I0319 12:00:26.632594 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:00:26.633315 master-0 kubenswrapper[17644]: I0319 12:00:26.633235 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:00:26.633871 master-0 kubenswrapper[17644]: I0319 12:00:26.633815 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:00:26.634133 master-0 kubenswrapper[17644]: I0319 12:00:26.634026 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:00:26.634185 master-0 kubenswrapper[17644]: I0319 12:00:26.634154 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:00:26.635371 master-0 kubenswrapper[17644]: I0319 12:00:26.635316 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a245a2be-a5d8-4004-99dc-013ae1da116b" (UID: "a245a2be-a5d8-4004-99dc-013ae1da116b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:00:26.726561 master-0 kubenswrapper[17644]: I0319 12:00:26.726471 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-router-certs\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.726810 master-0 kubenswrapper[17644]: I0319 12:00:26.726593 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e43eccd-712d-459a-92af-5c2e900409c0-audit-dir\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.726810 master-0 kubenswrapper[17644]: I0319 12:00:26.726624 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-session\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.726810 master-0 kubenswrapper[17644]: I0319 12:00:26.726653 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.726931 master-0 kubenswrapper[17644]: I0319 12:00:26.726789 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-login\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.726931 master-0 kubenswrapper[17644]: I0319 12:00:26.726887 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.726931 master-0 kubenswrapper[17644]: I0319 12:00:26.726915 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-error\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.727165 master-0 kubenswrapper[17644]: I0319 12:00:26.727087 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:00:26.727230 master-0 kubenswrapper[17644]: I0319 12:00:26.727214
17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.727327 master-0 kubenswrapper[17644]: I0319 12:00:26.727294 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-audit-policies\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.727415 master-0 kubenswrapper[17644]: I0319 12:00:26.727385 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqp8z\" (UniqueName: \"kubernetes.io/projected/3e43eccd-712d-459a-92af-5c2e900409c0-kube-api-access-vqp8z\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.727506 master-0 kubenswrapper[17644]: I0319 12:00:26.727481 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-service-ca\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.727671 master-0 kubenswrapper[17644]: I0319 12:00:26.727636 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.727954 master-0 kubenswrapper[17644]: I0319 12:00:26.727885 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.727954 master-0 kubenswrapper[17644]: I0319 12:00:26.727952 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728044 master-0 kubenswrapper[17644]: I0319 12:00:26.727974 17644 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728044 master-0 kubenswrapper[17644]: I0319 12:00:26.727992 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728044 master-0 kubenswrapper[17644]: I0319 12:00:26.728007 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728044 master-0 kubenswrapper[17644]: I0319 12:00:26.728021 17644 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728044 master-0 kubenswrapper[17644]: I0319 12:00:26.728033 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728044 master-0 kubenswrapper[17644]: I0319 12:00:26.728048 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728207 master-0 kubenswrapper[17644]: I0319 12:00:26.728061 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728207 master-0 kubenswrapper[17644]: I0319 12:00:26.728073 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728207 master-0 kubenswrapper[17644]: I0319 12:00:26.728088 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728207 master-0 kubenswrapper[17644]: I0319 12:00:26.728101 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49qj9\" (UniqueName: 
\"kubernetes.io/projected/a245a2be-a5d8-4004-99dc-013ae1da116b-kube-api-access-49qj9\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.728207 master-0 kubenswrapper[17644]: I0319 12:00:26.728111 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a245a2be-a5d8-4004-99dc-013ae1da116b-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 19 12:00:26.830507 master-0 kubenswrapper[17644]: I0319 12:00:26.830439 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-router-certs\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830507 master-0 kubenswrapper[17644]: I0319 12:00:26.830506 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e43eccd-712d-459a-92af-5c2e900409c0-audit-dir\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830532 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-session\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830555 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830588 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-login\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830609 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830629 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-error\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830658 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830683 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830706 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-audit-policies\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830751 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqp8z\" (UniqueName: \"kubernetes.io/projected/3e43eccd-712d-459a-92af-5c2e900409c0-kube-api-access-vqp8z\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.830824 master-0 kubenswrapper[17644]: I0319 12:00:26.830802 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-service-ca\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.831273 master-0 kubenswrapper[17644]: I0319 
12:00:26.830850 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.831533 master-0 kubenswrapper[17644]: I0319 12:00:26.831487 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e43eccd-712d-459a-92af-5c2e900409c0-audit-dir\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.835753 master-0 kubenswrapper[17644]: I0319 12:00:26.832438 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.835753 master-0 kubenswrapper[17644]: I0319 12:00:26.832640 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.835753 master-0 kubenswrapper[17644]: I0319 12:00:26.833938 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-session\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.835753 master-0 kubenswrapper[17644]: I0319 12:00:26.834627 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-router-certs\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.835753 master-0 kubenswrapper[17644]: I0319 12:00:26.835148 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-login\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.835954 master-0 kubenswrapper[17644]: I0319 12:00:26.835770 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.835954 master-0 kubenswrapper[17644]: I0319 12:00:26.835834 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: 
\"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.836473 master-0 kubenswrapper[17644]: I0319 12:00:26.836441 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.836533 master-0 kubenswrapper[17644]: I0319 12:00:26.836447 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-error\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.836606 master-0 kubenswrapper[17644]: I0319 12:00:26.836577 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-service-ca\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.836823 master-0 kubenswrapper[17644]: I0319 12:00:26.836799 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-audit-policies\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.847818 master-0 kubenswrapper[17644]: I0319 12:00:26.847699 17644 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vqp8z\" (UniqueName: \"kubernetes.io/projected/3e43eccd-712d-459a-92af-5c2e900409c0-kube-api-access-vqp8z\") pod \"oauth-openshift-847fb46dcc-qwvn8\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:26.876992 master-0 kubenswrapper[17644]: I0319 12:00:26.876847 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:27.340098 master-0 kubenswrapper[17644]: I0319 12:00:27.340032 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" event={"ID":"a245a2be-a5d8-4004-99dc-013ae1da116b","Type":"ContainerDied","Data":"0ff2056a73249521f24ca1126dca54b921ba317a2b76cd12ea17abe274bdf2c2"} Mar 19 12:00:27.340098 master-0 kubenswrapper[17644]: I0319 12:00:27.340106 17644 scope.go:117] "RemoveContainer" containerID="87329a5c60581def1191476a28ca9a42a69fa05d9396af87d913af5505fd2346" Mar 19 12:00:27.340725 master-0 kubenswrapper[17644]: I0319 12:00:27.340106 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5cb4757995-2scrh" Mar 19 12:00:27.500566 master-0 kubenswrapper[17644]: I0319 12:00:27.500424 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"] Mar 19 12:00:27.508105 master-0 kubenswrapper[17644]: W0319 12:00:27.508057 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e43eccd_712d_459a_92af_5c2e900409c0.slice/crio-1081c0cd1af59e4de9012cc7cf2706812e9d825c27f618f134714732b6230dda WatchSource:0}: Error finding container 1081c0cd1af59e4de9012cc7cf2706812e9d825c27f618f134714732b6230dda: Status 404 returned error can't find the container with id 1081c0cd1af59e4de9012cc7cf2706812e9d825c27f618f134714732b6230dda Mar 19 12:00:27.537441 master-0 kubenswrapper[17644]: I0319 12:00:27.537392 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5cb4757995-2scrh"] Mar 19 12:00:27.555864 master-0 kubenswrapper[17644]: I0319 12:00:27.554425 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-5cb4757995-2scrh"] Mar 19 12:00:28.272161 master-0 kubenswrapper[17644]: I0319 12:00:28.272022 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 12:00:28.272343 master-0 kubenswrapper[17644]: E0319 12:00:28.272177 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. 
No retries permitted until 2026-03-19 12:01:00.272163636 +0000 UTC m=+94.042121671 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:00:28.350782 master-0 kubenswrapper[17644]: I0319 12:00:28.350703 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" event={"ID":"3e43eccd-712d-459a-92af-5c2e900409c0","Type":"ContainerStarted","Data":"04f45dfb302524bd6eb9768c32c0f5f01aa67d750a9b740fc57bde47ed7c9d45"} Mar 19 12:00:28.350782 master-0 kubenswrapper[17644]: I0319 12:00:28.350764 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" event={"ID":"3e43eccd-712d-459a-92af-5c2e900409c0","Type":"ContainerStarted","Data":"1081c0cd1af59e4de9012cc7cf2706812e9d825c27f618f134714732b6230dda"} Mar 19 12:00:28.351658 master-0 kubenswrapper[17644]: I0319 12:00:28.351608 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:28.365752 master-0 kubenswrapper[17644]: I0319 12:00:28.365680 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:00:28.397565 master-0 kubenswrapper[17644]: I0319 12:00:28.397447 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" podStartSLOduration=28.397416657 podStartE2EDuration="28.397416657s" podCreationTimestamp="2026-03-19 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
12:00:28.372623375 +0000 UTC m=+62.142581420" watchObservedRunningTime="2026-03-19 12:00:28.397416657 +0000 UTC m=+62.167374702" Mar 19 12:00:28.497757 master-0 kubenswrapper[17644]: I0319 12:00:28.496892 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a245a2be-a5d8-4004-99dc-013ae1da116b" path="/var/lib/kubelet/pods/a245a2be-a5d8-4004-99dc-013ae1da116b/volumes" Mar 19 12:00:30.873959 master-0 kubenswrapper[17644]: I0319 12:00:30.872167 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 19 12:00:30.873959 master-0 kubenswrapper[17644]: I0319 12:00:30.873238 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:00:30.878332 master-0 kubenswrapper[17644]: I0319 12:00:30.878259 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-kpv7f" Mar 19 12:00:30.878707 master-0 kubenswrapper[17644]: I0319 12:00:30.878595 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 12:00:30.914986 master-0 kubenswrapper[17644]: I0319 12:00:30.914892 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:00:30.915295 master-0 kubenswrapper[17644]: I0319 12:00:30.915113 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-var-lock\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0" 
Mar 19 12:00:30.915295 master-0 kubenswrapper[17644]: I0319 12:00:30.915273 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:00:30.915964 master-0 kubenswrapper[17644]: I0319 12:00:30.915899 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 19 12:00:31.017206 master-0 kubenswrapper[17644]: I0319 12:00:31.017081 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:00:31.017713 master-0 kubenswrapper[17644]: I0319 12:00:31.017280 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:00:31.017713 master-0 kubenswrapper[17644]: I0319 12:00:31.017328 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-var-lock\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:00:31.017713 master-0 kubenswrapper[17644]: I0319 12:00:31.017384 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-var-lock\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 12:00:31.017713 master-0 kubenswrapper[17644]: I0319 12:00:31.017498 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 12:00:31.039767 master-0 kubenswrapper[17644]: I0319 12:00:31.039668 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 12:00:31.198039 master-0 kubenswrapper[17644]: I0319 12:00:31.197930 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 12:00:31.637709 master-0 kubenswrapper[17644]: I0319 12:00:31.637473 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 12:00:31.644219 master-0 kubenswrapper[17644]: W0319 12:00:31.644121 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5cc396b6_f9af_464a_a2fb_1376bc1400a9.slice/crio-b11749f56ff0cdddc12f021d9d937dc60a6e0e6132d952c506c751da893297a4 WatchSource:0}: Error finding container b11749f56ff0cdddc12f021d9d937dc60a6e0e6132d952c506c751da893297a4: Status 404 returned error can't find the container with id b11749f56ff0cdddc12f021d9d937dc60a6e0e6132d952c506c751da893297a4
Mar 19 12:00:32.383685 master-0 kubenswrapper[17644]: I0319 12:00:32.383506 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"5cc396b6-f9af-464a-a2fb-1376bc1400a9","Type":"ContainerStarted","Data":"5c904af7c630cd3f621fba91f5013ec4d5c84bc0d039fc46452ea52641e9a92a"}
Mar 19 12:00:32.383685 master-0 kubenswrapper[17644]: I0319 12:00:32.383567 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"5cc396b6-f9af-464a-a2fb-1376bc1400a9","Type":"ContainerStarted","Data":"b11749f56ff0cdddc12f021d9d937dc60a6e0e6132d952c506c751da893297a4"}
Mar 19 12:00:32.409213 master-0 kubenswrapper[17644]: I0319 12:00:32.409125 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=2.409108664 podStartE2EDuration="2.409108664s" podCreationTimestamp="2026-03-19 12:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:00:32.407372551 +0000 UTC m=+66.177330596" watchObservedRunningTime="2026-03-19 12:00:32.409108664 +0000 UTC m=+66.179066689"
Mar 19 12:00:40.673688 master-0 kubenswrapper[17644]: I0319 12:00:40.673616 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 12:00:40.676756 master-0 kubenswrapper[17644]: I0319 12:00:40.676679 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 12:00:40.774956 master-0 kubenswrapper[17644]: I0319 12:00:40.774872 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") pod \"1c576a88-6da4-43e9-a373-0df27a029f59\" (UID: \"1c576a88-6da4-43e9-a373-0df27a029f59\") "
Mar 19 12:00:40.777889 master-0 kubenswrapper[17644]: I0319 12:00:40.777822 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c576a88-6da4-43e9-a373-0df27a029f59" (UID: "1c576a88-6da4-43e9-a373-0df27a029f59"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:00:40.877344 master-0 kubenswrapper[17644]: I0319 12:00:40.877254 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c576a88-6da4-43e9-a373-0df27a029f59-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 12:00:45.544198 master-0 kubenswrapper[17644]: I0319 12:00:45.544138 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 12:00:45.544816 master-0 kubenswrapper[17644]: I0319 12:00:45.544411 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-4-master-0" podUID="5cc396b6-f9af-464a-a2fb-1376bc1400a9" containerName="installer" containerID="cri-o://5c904af7c630cd3f621fba91f5013ec4d5c84bc0d039fc46452ea52641e9a92a" gracePeriod=30
Mar 19 12:00:48.747783 master-0 kubenswrapper[17644]: I0319 12:00:48.746922 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 19 12:00:48.748718 master-0 kubenswrapper[17644]: I0319 12:00:48.748116 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.764708 master-0 kubenswrapper[17644]: I0319 12:00:48.764643 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 19 12:00:48.812782 master-0 kubenswrapper[17644]: I0319 12:00:48.812682 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kube-api-access\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.812782 master-0 kubenswrapper[17644]: I0319 12:00:48.812762 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-var-lock\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.812782 master-0 kubenswrapper[17644]: I0319 12:00:48.812783 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.914755 master-0 kubenswrapper[17644]: I0319 12:00:48.914680 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kube-api-access\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.914755 master-0 kubenswrapper[17644]: I0319 12:00:48.914737 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-var-lock\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.915067 master-0 kubenswrapper[17644]: I0319 12:00:48.914787 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.915067 master-0 kubenswrapper[17644]: I0319 12:00:48.914900 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.915067 master-0 kubenswrapper[17644]: I0319 12:00:48.914940 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-var-lock\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:48.932586 master-0 kubenswrapper[17644]: I0319 12:00:48.932530 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kube-api-access\") pod \"installer-5-master-0\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:49.076900 master-0 kubenswrapper[17644]: I0319 12:00:49.076706 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 12:00:49.473190 master-0 kubenswrapper[17644]: I0319 12:00:49.473123 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 19 12:00:49.483746 master-0 kubenswrapper[17644]: W0319 12:00:49.483676 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2dc94fa1_b977_4f27_9a30_f9dc6cbbfe92.slice/crio-9bbbb23c49f69bbd9dd929f52583c52ed1da912b0840ba0c18e8e5ac11b5ef56 WatchSource:0}: Error finding container 9bbbb23c49f69bbd9dd929f52583c52ed1da912b0840ba0c18e8e5ac11b5ef56: Status 404 returned error can't find the container with id 9bbbb23c49f69bbd9dd929f52583c52ed1da912b0840ba0c18e8e5ac11b5ef56
Mar 19 12:00:50.496096 master-0 kubenswrapper[17644]: I0319 12:00:50.496026 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92","Type":"ContainerStarted","Data":"43c029bd08cf161809d07bda47da6474003e9b31432977aff7b64679327e7530"}
Mar 19 12:00:50.496096 master-0 kubenswrapper[17644]: I0319 12:00:50.496080 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92","Type":"ContainerStarted","Data":"9bbbb23c49f69bbd9dd929f52583c52ed1da912b0840ba0c18e8e5ac11b5ef56"}
Mar 19 12:00:50.518690 master-0 kubenswrapper[17644]: I0319 12:00:50.518567 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=2.518534262 podStartE2EDuration="2.518534262s" podCreationTimestamp="2026-03-19 12:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:00:50.516575363 +0000 UTC m=+84.286533408" watchObservedRunningTime="2026-03-19 12:00:50.518534262 +0000 UTC m=+84.288492297"
Mar 19 12:01:00.291304 master-0 kubenswrapper[17644]: I0319 12:01:00.291200 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj"
Mar 19 12:01:00.293485 master-0 kubenswrapper[17644]: E0319 12:01:00.291409 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. No retries permitted until 2026-03-19 12:02:04.29138977 +0000 UTC m=+158.061347805 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt
Mar 19 12:01:02.577397 master-0 kubenswrapper[17644]: I0319 12:01:02.577345 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_5cc396b6-f9af-464a-a2fb-1376bc1400a9/installer/0.log"
Mar 19 12:01:02.577397 master-0 kubenswrapper[17644]: I0319 12:01:02.577398 17644 generic.go:334] "Generic (PLEG): container finished" podID="5cc396b6-f9af-464a-a2fb-1376bc1400a9" containerID="5c904af7c630cd3f621fba91f5013ec4d5c84bc0d039fc46452ea52641e9a92a" exitCode=1
Mar 19 12:01:02.578101 master-0 kubenswrapper[17644]: I0319 12:01:02.577432 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"5cc396b6-f9af-464a-a2fb-1376bc1400a9","Type":"ContainerDied","Data":"5c904af7c630cd3f621fba91f5013ec4d5c84bc0d039fc46452ea52641e9a92a"}
Mar 19 12:01:02.841215 master-0 kubenswrapper[17644]: I0319 12:01:02.841161 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_5cc396b6-f9af-464a-a2fb-1376bc1400a9/installer/0.log"
Mar 19 12:01:02.841424 master-0 kubenswrapper[17644]: I0319 12:01:02.841255 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 12:01:02.928487 master-0 kubenswrapper[17644]: I0319 12:01:02.928421 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-var-lock\") pod \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") "
Mar 19 12:01:02.928706 master-0 kubenswrapper[17644]: I0319 12:01:02.928605 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kube-api-access\") pod \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") "
Mar 19 12:01:02.928706 master-0 kubenswrapper[17644]: I0319 12:01:02.928636 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kubelet-dir\") pod \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\" (UID: \"5cc396b6-f9af-464a-a2fb-1376bc1400a9\") "
Mar 19 12:01:02.928918 master-0 kubenswrapper[17644]: I0319 12:01:02.928879 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-var-lock" (OuterVolumeSpecName: "var-lock") pod "5cc396b6-f9af-464a-a2fb-1376bc1400a9" (UID: "5cc396b6-f9af-464a-a2fb-1376bc1400a9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:01:02.928956 master-0 kubenswrapper[17644]: I0319 12:01:02.928929 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5cc396b6-f9af-464a-a2fb-1376bc1400a9" (UID: "5cc396b6-f9af-464a-a2fb-1376bc1400a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:01:02.929038 master-0 kubenswrapper[17644]: I0319 12:01:02.929013 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:01:02.929038 master-0 kubenswrapper[17644]: I0319 12:01:02.929035 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:01:02.931897 master-0 kubenswrapper[17644]: I0319 12:01:02.931831 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5cc396b6-f9af-464a-a2fb-1376bc1400a9" (UID: "5cc396b6-f9af-464a-a2fb-1376bc1400a9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:01:03.030752 master-0 kubenswrapper[17644]: I0319 12:01:03.030685 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cc396b6-f9af-464a-a2fb-1376bc1400a9-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 12:01:03.587125 master-0 kubenswrapper[17644]: I0319 12:01:03.587059 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_5cc396b6-f9af-464a-a2fb-1376bc1400a9/installer/0.log"
Mar 19 12:01:03.587125 master-0 kubenswrapper[17644]: I0319 12:01:03.587127 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"5cc396b6-f9af-464a-a2fb-1376bc1400a9","Type":"ContainerDied","Data":"b11749f56ff0cdddc12f021d9d937dc60a6e0e6132d952c506c751da893297a4"}
Mar 19 12:01:03.587865 master-0 kubenswrapper[17644]: I0319 12:01:03.587175 17644 scope.go:117] "RemoveContainer" containerID="5c904af7c630cd3f621fba91f5013ec4d5c84bc0d039fc46452ea52641e9a92a"
Mar 19 12:01:03.587865 master-0 kubenswrapper[17644]: I0319 12:01:03.587178 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 12:01:03.750197 master-0 kubenswrapper[17644]: I0319 12:01:03.750141 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 12:01:03.792017 master-0 kubenswrapper[17644]: I0319 12:01:03.791949 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 12:01:04.490798 master-0 kubenswrapper[17644]: I0319 12:01:04.490714 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc396b6-f9af-464a-a2fb-1376bc1400a9" path="/var/lib/kubelet/pods/5cc396b6-f9af-464a-a2fb-1376bc1400a9/volumes"
Mar 19 12:01:47.572774 master-0 kubenswrapper[17644]: I0319 12:01:47.572007 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:01:47.572774 master-0 kubenswrapper[17644]: E0319 12:01:47.572430 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc396b6-f9af-464a-a2fb-1376bc1400a9" containerName="installer"
Mar 19 12:01:47.572774 master-0 kubenswrapper[17644]: I0319 12:01:47.572446 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc396b6-f9af-464a-a2fb-1376bc1400a9" containerName="installer"
Mar 19 12:01:47.573650 master-0 kubenswrapper[17644]: I0319 12:01:47.572824 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc396b6-f9af-464a-a2fb-1376bc1400a9" containerName="installer"
Mar 19 12:01:47.573650 master-0 kubenswrapper[17644]: I0319 12:01:47.573329 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:01:47.573650 master-0 kubenswrapper[17644]: I0319 12:01:47.573485 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.573650 master-0 kubenswrapper[17644]: I0319 12:01:47.573614 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" containerID="cri-o://991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd" gracePeriod=15
Mar 19 12:01:47.573843 master-0 kubenswrapper[17644]: I0319 12:01:47.573658 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4" gracePeriod=15
Mar 19 12:01:47.573843 master-0 kubenswrapper[17644]: I0319 12:01:47.573784 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" containerID="cri-o://2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778" gracePeriod=15
Mar 19 12:01:47.573843 master-0 kubenswrapper[17644]: I0319 12:01:47.573658 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c" gracePeriod=15
Mar 19 12:01:47.573967 master-0 kubenswrapper[17644]: I0319 12:01:47.573876 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29" gracePeriod=15
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.575646 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: E0319 12:01:47.576190 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576209 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: E0319 12:01:47.576246 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576258 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: E0319 12:01:47.576287 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576294 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: E0319 12:01:47.576314 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576321 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: E0319 12:01:47.576331 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576338 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: E0319 12:01:47.576352 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576361 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576483 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576506 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576528 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576538 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576577 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576592 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: E0319 12:01:47.576748 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints"
Mar 19 12:01:47.577757 master-0 kubenswrapper[17644]: I0319 12:01:47.576757 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints"
Mar 19 12:01:47.669996 master-0 kubenswrapper[17644]: E0319 12:01:47.669931 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.670991 master-0 kubenswrapper[17644]: I0319 12:01:47.670933 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.671101 master-0 kubenswrapper[17644]: I0319 12:01:47.671079 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.671198 master-0 kubenswrapper[17644]: I0319 12:01:47.671183 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.671288 master-0 kubenswrapper[17644]: I0319 12:01:47.671275 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.671363 master-0 kubenswrapper[17644]: I0319 12:01:47.671351 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.671447 master-0 kubenswrapper[17644]: I0319 12:01:47.671434 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.671532 master-0 kubenswrapper[17644]: I0319 12:01:47.671520 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.671662 master-0 kubenswrapper[17644]: I0319 12:01:47.671647 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.773039 master-0 kubenswrapper[17644]: I0319 12:01:47.772976 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.773039 master-0 kubenswrapper[17644]: I0319 12:01:47.773038 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.773301 master-0 kubenswrapper[17644]: I0319 12:01:47.773130 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.773301 master-0 kubenswrapper[17644]: I0319 12:01:47.773230 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.773301 master-0 kubenswrapper[17644]: I0319 12:01:47.773289 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.773301 master-0 kubenswrapper[17644]: I0319 12:01:47.773263 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.773658 master-0 kubenswrapper[17644]: I0319 12:01:47.773382 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.773769 master-0 kubenswrapper[17644]: I0319 12:01:47.773715 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.773886 master-0 kubenswrapper[17644]: I0319 12:01:47.773850 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.774021 master-0 kubenswrapper[17644]: I0319 12:01:47.773991 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.774080 master-0 kubenswrapper[17644]: I0319 12:01:47.774026 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.774080 master-0 kubenswrapper[17644]: I0319 12:01:47.774051 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.774171 master-0 kubenswrapper[17644]: I0319 12:01:47.774147 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.774219 master-0 kubenswrapper[17644]: I0319 12:01:47.774189 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.774219 master-0 kubenswrapper[17644]: I0319 12:01:47.774213 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:01:47.774319 master-0 kubenswrapper[17644]: I0319 12:01:47.774238 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:01:47.851710 master-0 kubenswrapper[17644]: I0319 12:01:47.851523 17644 generic.go:334] "Generic (PLEG): container finished" podID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" containerID="43c029bd08cf161809d07bda47da6474003e9b31432977aff7b64679327e7530" exitCode=0
Mar 19 12:01:47.851710 master-0 kubenswrapper[17644]: I0319 12:01:47.851618 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92","Type":"ContainerDied","Data":"43c029bd08cf161809d07bda47da6474003e9b31432977aff7b64679327e7530"}
Mar 19 12:01:47.852897 master-0 kubenswrapper[17644]: I0319 12:01:47.852848 17644 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:01:47.853546 master-0 kubenswrapper[17644]: I0319 12:01:47.853498 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:01:47.854115 master-0 kubenswrapper[17644]: I0319 12:01:47.854084 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-check-endpoints/0.log"
Mar 19 12:01:47.855879 master-0 kubenswrapper[17644]: I0319 12:01:47.855855 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log"
Mar 19 12:01:47.856676 master-0 kubenswrapper[17644]: I0319 12:01:47.856640 17644 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c" exitCode=0
Mar 19 12:01:47.856676 master-0 kubenswrapper[17644]: I0319 12:01:47.856661 17644 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29" exitCode=0
Mar 19 12:01:47.856676 master-0 kubenswrapper[17644]: I0319 12:01:47.856668 17644 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4" exitCode=0
Mar 19 12:01:47.856676 master-0 kubenswrapper[17644]: I0319 12:01:47.856680 17644
generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778" exitCode=2 Mar 19 12:01:47.856863 master-0 kubenswrapper[17644]: I0319 12:01:47.856719 17644 scope.go:117] "RemoveContainer" containerID="3012b2963902713916d9cd34e1392325e6497b856aefe8cee37b525fe08e7328" Mar 19 12:01:47.971293 master-0 kubenswrapper[17644]: I0319 12:01:47.971195 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:01:47.998060 master-0 kubenswrapper[17644]: E0319 12:01:47.997863 17644 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e3c65106de680 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:85632c1cec8974aa874834e4cfff4c77,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:01:47.995555456 +0000 UTC m=+141.765513491,LastTimestamp:2026-03-19 12:01:47.995555456 +0000 UTC m=+141.765513491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:01:48.867670 master-0 kubenswrapper[17644]: I0319 12:01:48.867580 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 12:01:48.870851 master-0 kubenswrapper[17644]: I0319 12:01:48.870674 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"85632c1cec8974aa874834e4cfff4c77","Type":"ContainerStarted","Data":"39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96"} Mar 19 12:01:48.871204 master-0 kubenswrapper[17644]: I0319 12:01:48.870865 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"85632c1cec8974aa874834e4cfff4c77","Type":"ContainerStarted","Data":"e9de44d18504c66c6c998d0fcfdc8f6a05ac0d13f9aa3044b3c8048066e1b4d6"} Mar 19 12:01:48.873269 master-0 kubenswrapper[17644]: E0319 12:01:48.872320 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:01:48.873496 master-0 kubenswrapper[17644]: I0319 12:01:48.873271 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:49.173916 master-0 kubenswrapper[17644]: I0319 12:01:49.173882 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:01:49.175408 master-0 kubenswrapper[17644]: I0319 12:01:49.175265 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:49.299266 master-0 kubenswrapper[17644]: I0319 12:01:49.299196 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kubelet-dir\") pod \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " Mar 19 12:01:49.299646 master-0 kubenswrapper[17644]: I0319 12:01:49.299362 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" (UID: "2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:01:49.299876 master-0 kubenswrapper[17644]: I0319 12:01:49.299858 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-var-lock\") pod \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " Mar 19 12:01:49.300025 master-0 kubenswrapper[17644]: I0319 12:01:49.300007 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kube-api-access\") pod \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\" (UID: \"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92\") " Mar 19 12:01:49.300216 master-0 kubenswrapper[17644]: I0319 12:01:49.299903 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-var-lock" (OuterVolumeSpecName: "var-lock") pod "2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" (UID: "2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:01:49.300521 master-0 kubenswrapper[17644]: I0319 12:01:49.300502 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:01:49.300620 master-0 kubenswrapper[17644]: I0319 12:01:49.300605 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:01:49.302938 master-0 kubenswrapper[17644]: I0319 12:01:49.302870 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" (UID: "2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:01:49.402869 master-0 kubenswrapper[17644]: I0319 12:01:49.402698 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:01:49.878880 master-0 kubenswrapper[17644]: I0319 12:01:49.878809 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92","Type":"ContainerDied","Data":"9bbbb23c49f69bbd9dd929f52583c52ed1da912b0840ba0c18e8e5ac11b5ef56"} Mar 19 12:01:49.878880 master-0 kubenswrapper[17644]: I0319 12:01:49.878859 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bbbb23c49f69bbd9dd929f52583c52ed1da912b0840ba0c18e8e5ac11b5ef56" Mar 19 12:01:49.879431 master-0 kubenswrapper[17644]: I0319 12:01:49.878889 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:01:49.941853 master-0 kubenswrapper[17644]: I0319 12:01:49.941795 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:49.948027 master-0 kubenswrapper[17644]: I0319 12:01:49.947987 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 12:01:49.948794 master-0 kubenswrapper[17644]: I0319 12:01:49.948758 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:01:49.949952 master-0 kubenswrapper[17644]: I0319 12:01:49.949912 17644 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:49.950563 master-0 kubenswrapper[17644]: I0319 12:01:49.950508 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:50.113443 master-0 kubenswrapper[17644]: I0319 12:01:50.113277 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 12:01:50.113443 master-0 kubenswrapper[17644]: I0319 12:01:50.113416 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 12:01:50.113443 master-0 kubenswrapper[17644]: I0319 12:01:50.113455 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 12:01:50.113796 master-0 
kubenswrapper[17644]: I0319 12:01:50.113454 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:01:50.113796 master-0 kubenswrapper[17644]: I0319 12:01:50.113623 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:01:50.113796 master-0 kubenswrapper[17644]: I0319 12:01:50.113642 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:01:50.114055 master-0 kubenswrapper[17644]: I0319 12:01:50.114021 17644 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:01:50.114055 master-0 kubenswrapper[17644]: I0319 12:01:50.114047 17644 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:01:50.114124 master-0 kubenswrapper[17644]: I0319 12:01:50.114058 17644 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:01:50.490773 master-0 kubenswrapper[17644]: I0319 12:01:50.490673 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" path="/var/lib/kubelet/pods/b45ea2ef1cf2bc9d1d994d6538ae0a64/volumes" Mar 19 12:01:50.887125 master-0 kubenswrapper[17644]: I0319 12:01:50.886978 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 12:01:50.888454 master-0 kubenswrapper[17644]: I0319 12:01:50.888414 17644 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd" exitCode=0 Mar 19 12:01:50.888588 master-0 kubenswrapper[17644]: I0319 12:01:50.888549 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:01:50.889825 master-0 kubenswrapper[17644]: I0319 12:01:50.889700 17644 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:50.889918 master-0 kubenswrapper[17644]: I0319 12:01:50.889620 17644 scope.go:117] "RemoveContainer" containerID="9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c" Mar 19 12:01:50.890504 master-0 kubenswrapper[17644]: I0319 12:01:50.890456 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:50.895332 master-0 kubenswrapper[17644]: I0319 12:01:50.895151 17644 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:50.896490 master-0 kubenswrapper[17644]: I0319 12:01:50.896422 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:50.906664 master-0 kubenswrapper[17644]: I0319 
12:01:50.906369 17644 scope.go:117] "RemoveContainer" containerID="f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29" Mar 19 12:01:50.922116 master-0 kubenswrapper[17644]: I0319 12:01:50.922076 17644 scope.go:117] "RemoveContainer" containerID="f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4" Mar 19 12:01:50.936157 master-0 kubenswrapper[17644]: I0319 12:01:50.935145 17644 scope.go:117] "RemoveContainer" containerID="2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778" Mar 19 12:01:50.954616 master-0 kubenswrapper[17644]: I0319 12:01:50.954567 17644 scope.go:117] "RemoveContainer" containerID="991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd" Mar 19 12:01:50.969347 master-0 kubenswrapper[17644]: I0319 12:01:50.969328 17644 scope.go:117] "RemoveContainer" containerID="a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff" Mar 19 12:01:50.985291 master-0 kubenswrapper[17644]: I0319 12:01:50.985187 17644 scope.go:117] "RemoveContainer" containerID="9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c" Mar 19 12:01:50.985822 master-0 kubenswrapper[17644]: E0319 12:01:50.985789 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c\": container with ID starting with 9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c not found: ID does not exist" containerID="9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c" Mar 19 12:01:50.985887 master-0 kubenswrapper[17644]: I0319 12:01:50.985854 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c"} err="failed to get container status \"9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c\": rpc error: code = NotFound desc = could not find container 
\"9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c\": container with ID starting with 9de1b7d1e2349a3f400091678dc99b0b7f1f76ada96d88181116adceec1f835c not found: ID does not exist" Mar 19 12:01:50.985929 master-0 kubenswrapper[17644]: I0319 12:01:50.985889 17644 scope.go:117] "RemoveContainer" containerID="f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29" Mar 19 12:01:50.986381 master-0 kubenswrapper[17644]: E0319 12:01:50.986354 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29\": container with ID starting with f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29 not found: ID does not exist" containerID="f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29" Mar 19 12:01:50.986485 master-0 kubenswrapper[17644]: I0319 12:01:50.986460 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29"} err="failed to get container status \"f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29\": rpc error: code = NotFound desc = could not find container \"f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29\": container with ID starting with f96d175f0cd36aaf469f89a28d2a0993ca551b8d590ac0d9e0b11a56d879ec29 not found: ID does not exist" Mar 19 12:01:50.986560 master-0 kubenswrapper[17644]: I0319 12:01:50.986547 17644 scope.go:117] "RemoveContainer" containerID="f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4" Mar 19 12:01:50.986925 master-0 kubenswrapper[17644]: E0319 12:01:50.986905 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4\": container with ID starting with 
f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4 not found: ID does not exist" containerID="f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4" Mar 19 12:01:50.987061 master-0 kubenswrapper[17644]: I0319 12:01:50.987041 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4"} err="failed to get container status \"f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4\": rpc error: code = NotFound desc = could not find container \"f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4\": container with ID starting with f9eb29fd4fd09864a6d14d4e4d10b2022ffb83f13ece47bbceaba5b7bd3c3dd4 not found: ID does not exist" Mar 19 12:01:50.987132 master-0 kubenswrapper[17644]: I0319 12:01:50.987121 17644 scope.go:117] "RemoveContainer" containerID="2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778" Mar 19 12:01:50.987464 master-0 kubenswrapper[17644]: E0319 12:01:50.987446 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778\": container with ID starting with 2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778 not found: ID does not exist" containerID="2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778" Mar 19 12:01:50.987551 master-0 kubenswrapper[17644]: I0319 12:01:50.987534 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778"} err="failed to get container status \"2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778\": rpc error: code = NotFound desc = could not find container \"2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778\": container with ID starting with 
2a13852b86c512a96024f86ef51091c77de4129071d7f100f1b56772f75c4778 not found: ID does not exist" Mar 19 12:01:50.987611 master-0 kubenswrapper[17644]: I0319 12:01:50.987599 17644 scope.go:117] "RemoveContainer" containerID="991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd" Mar 19 12:01:50.987947 master-0 kubenswrapper[17644]: E0319 12:01:50.987918 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd\": container with ID starting with 991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd not found: ID does not exist" containerID="991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd" Mar 19 12:01:50.988002 master-0 kubenswrapper[17644]: I0319 12:01:50.987975 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd"} err="failed to get container status \"991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd\": rpc error: code = NotFound desc = could not find container \"991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd\": container with ID starting with 991feed2291ac9a84bafc878cdc07f7aa3c0c5e50e56fe23c94905ee545d3fbd not found: ID does not exist" Mar 19 12:01:50.988043 master-0 kubenswrapper[17644]: I0319 12:01:50.988000 17644 scope.go:117] "RemoveContainer" containerID="a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff" Mar 19 12:01:50.988366 master-0 kubenswrapper[17644]: E0319 12:01:50.988347 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff\": container with ID starting with a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff not found: ID does not exist" 
containerID="a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff" Mar 19 12:01:50.988409 master-0 kubenswrapper[17644]: I0319 12:01:50.988366 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff"} err="failed to get container status \"a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff\": rpc error: code = NotFound desc = could not find container \"a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff\": container with ID starting with a5f86fdf43005285d71f6c8884db1a78fa394b1a17074bd7f8a4187de0fcd0ff not found: ID does not exist" Mar 19 12:01:55.104915 master-0 kubenswrapper[17644]: E0319 12:01:55.104765 17644 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e3c65106de680 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:85632c1cec8974aa874834e4cfff4c77,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:01:47.995555456 +0000 UTC m=+141.765513491,LastTimestamp:2026-03-19 12:01:47.995555456 +0000 UTC m=+141.765513491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:01:56.364452 master-0 kubenswrapper[17644]: E0319 12:01:56.364379 
17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:56.365361 master-0 kubenswrapper[17644]: E0319 12:01:56.365085 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:56.365774 master-0 kubenswrapper[17644]: E0319 12:01:56.365720 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:56.366283 master-0 kubenswrapper[17644]: E0319 12:01:56.366233 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:56.366878 master-0 kubenswrapper[17644]: E0319 12:01:56.366822 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:01:56.366878 master-0 kubenswrapper[17644]: I0319 12:01:56.366853 17644 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 12:01:56.367414 master-0 kubenswrapper[17644]: E0319 12:01:56.367375 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 19 12:01:56.491116 master-0 kubenswrapper[17644]: I0319 12:01:56.491004 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:01:56.569427 master-0 kubenswrapper[17644]: E0319 12:01:56.569324 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 19 12:01:56.971650 master-0 kubenswrapper[17644]: E0319 12:01:56.971548 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 19 12:01:57.774055 master-0 kubenswrapper[17644]: E0319 12:01:57.773948 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 12:01:59.147185 master-0 kubenswrapper[17644]: E0319 12:01:59.147108 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" podUID="d2fd7597-cd7a-4138-bb3c-01681c569bd3"
Mar 19 12:01:59.375988 master-0 kubenswrapper[17644]: E0319 12:01:59.375919 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 19 12:01:59.968705 master-0 kubenswrapper[17644]: I0319 12:01:59.968577 17644 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825" exitCode=1
Mar 19 12:01:59.969100 master-0 kubenswrapper[17644]: I0319 12:01:59.968708 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825"}
Mar 19 12:01:59.969100 master-0 kubenswrapper[17644]: I0319 12:01:59.968833 17644 scope.go:117] "RemoveContainer" containerID="901ed10fec5e9417fcd7522a27f15f9a949e9c0dd2ab8e429fd9b30afd0247bf"
Mar 19 12:01:59.969100 master-0 kubenswrapper[17644]: I0319 12:01:59.968756 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj"
Mar 19 12:01:59.969968 master-0 kubenswrapper[17644]: I0319 12:01:59.969934 17644 scope.go:117] "RemoveContainer" containerID="a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825"
Mar 19 12:01:59.971280 master-0 kubenswrapper[17644]: I0319 12:01:59.970755 17644 status_manager.go:851] "Failed to get status for pod" podUID="c83737980b9ee109184b1d78e942cf36" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:01:59.971955 master-0 kubenswrapper[17644]: I0319 12:01:59.971868 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:02:00.483169 master-0 kubenswrapper[17644]: I0319 12:02:00.483030 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:00.484191 master-0 kubenswrapper[17644]: I0319 12:02:00.484148 17644 status_manager.go:851] "Failed to get status for pod" podUID="c83737980b9ee109184b1d78e942cf36" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:02:00.484643 master-0 kubenswrapper[17644]: I0319 12:02:00.484607 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:02:00.506277 master-0 kubenswrapper[17644]: I0319 12:02:00.506219 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:00.506277 master-0 kubenswrapper[17644]: I0319 12:02:00.506255 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:00.506979 master-0 kubenswrapper[17644]: E0319 12:02:00.506941 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:00.507411 master-0 kubenswrapper[17644]: I0319 12:02:00.507381 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:00.523634 master-0 kubenswrapper[17644]: W0319 12:02:00.523590 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5f502b117c7c8479f7f20848a50fec0.slice/crio-3975dd76d35e8774800003a73e05a96c44014eb02b24ef58ae992358d3e8092e WatchSource:0}: Error finding container 3975dd76d35e8774800003a73e05a96c44014eb02b24ef58ae992358d3e8092e: Status 404 returned error can't find the container with id 3975dd76d35e8774800003a73e05a96c44014eb02b24ef58ae992358d3e8092e
Mar 19 12:02:00.976188 master-0 kubenswrapper[17644]: I0319 12:02:00.976130 17644 generic.go:334] "Generic (PLEG): container finished" podID="d5f502b117c7c8479f7f20848a50fec0" containerID="07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52" exitCode=0
Mar 19 12:02:00.976417 master-0 kubenswrapper[17644]: I0319 12:02:00.976203 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerDied","Data":"07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52"}
Mar 19 12:02:00.976417 master-0 kubenswrapper[17644]: I0319 12:02:00.976239 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"3975dd76d35e8774800003a73e05a96c44014eb02b24ef58ae992358d3e8092e"}
Mar 19 12:02:00.976552 master-0 kubenswrapper[17644]: I0319 12:02:00.976523 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:00.976552 master-0 kubenswrapper[17644]: I0319 12:02:00.976542 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0"
podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:00.977464 master-0 kubenswrapper[17644]: E0319 12:02:00.977413 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:00.977559 master-0 kubenswrapper[17644]: I0319 12:02:00.977482 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:02:00.978576 master-0 kubenswrapper[17644]: I0319 12:02:00.978134 17644 status_manager.go:851] "Failed to get status for pod" podUID="c83737980b9ee109184b1d78e942cf36" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:02:00.979074 master-0 kubenswrapper[17644]: I0319 12:02:00.979039 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6"}
Mar 19 12:02:00.979652 master-0 kubenswrapper[17644]: I0319 12:02:00.979608 17644 status_manager.go:851] "Failed to get status for pod" podUID="c83737980b9ee109184b1d78e942cf36" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:02:00.980096 master-0 kubenswrapper[17644]: I0319 12:02:00.980001 17644 status_manager.go:851] "Failed to get status for pod" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:02:02.005084 master-0 kubenswrapper[17644]: I0319 12:02:02.005016 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931"}
Mar 19 12:02:02.005084 master-0 kubenswrapper[17644]: I0319 12:02:02.005067 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8"}
Mar 19 12:02:02.005084 master-0 kubenswrapper[17644]: I0319 12:02:02.005083 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9"}
Mar 19 12:02:02.005829 master-0 kubenswrapper[17644]: I0319 12:02:02.005117 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660"}
Mar 19 12:02:03.014581 master-0 kubenswrapper[17644]: I0319 12:02:03.014524 17644 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="b533d36029413fb01ef12a682704ad486041204246c172de95f0a4aeff2f5180" exitCode=1
Mar 19 12:02:03.015167 master-0 kubenswrapper[17644]: I0319 12:02:03.014583 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"b533d36029413fb01ef12a682704ad486041204246c172de95f0a4aeff2f5180"}
Mar 19 12:02:03.015167 master-0 kubenswrapper[17644]: I0319 12:02:03.014683 17644 scope.go:117] "RemoveContainer" containerID="70e30dd45946084b4dbfa27658bf40bdaa54f00c37bf6e48547b5796a6b773e3"
Mar 19 12:02:03.015259 master-0 kubenswrapper[17644]: I0319 12:02:03.015231 17644 scope.go:117] "RemoveContainer" containerID="b533d36029413fb01ef12a682704ad486041204246c172de95f0a4aeff2f5180"
Mar 19 12:02:03.019479 master-0 kubenswrapper[17644]: I0319 12:02:03.019451 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9"}
Mar 19 12:02:03.019715 master-0 kubenswrapper[17644]: I0319 12:02:03.019683 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:03.019715 master-0 kubenswrapper[17644]: I0319 12:02:03.019711 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:03.019853 master-0 kubenswrapper[17644]: I0319 12:02:03.019809 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:04.028082 master-0 kubenswrapper[17644]: I0319 12:02:04.028026 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"fa33151970d752ef2161babaa56491652362bb6f1d5e173d5390c7f59b36f27d"}
Mar 19 12:02:04.342822 master-0 kubenswrapper[17644]: I0319 12:02:04.342679 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj"
Mar 19 12:02:04.343325 master-0 kubenswrapper[17644]: E0319 12:02:04.342945 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca podName:d2fd7597-cd7a-4138-bb3c-01681c569bd3 nodeName:}" failed. No retries permitted until 2026-03-19 12:04:06.342927861 +0000 UTC m=+280.112885896 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca") pod "console-operator-76b6568d85-8bvjj" (UID: "d2fd7597-cd7a-4138-bb3c-01681c569bd3") : configmap references non-existent config key: ca-bundle.crt
Mar 19 12:02:05.507880 master-0 kubenswrapper[17644]: I0319 12:02:05.507807 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:05.507880 master-0 kubenswrapper[17644]: I0319 12:02:05.507888 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:05.513688 master-0 kubenswrapper[17644]: I0319 12:02:05.513637 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:05.786988 master-0 kubenswrapper[17644]: I0319 12:02:05.786848 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 12:02:08.034036 master-0 kubenswrapper[17644]: I0319 12:02:08.033992 17644 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:08.135153 master-0 kubenswrapper[17644]: I0319 12:02:08.135088 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="d5f502b117c7c8479f7f20848a50fec0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954"
Mar 19 12:02:09.055319 master-0 kubenswrapper[17644]: I0319 12:02:09.055244 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:09.055319 master-0 kubenswrapper[17644]: I0319 12:02:09.055292 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:09.059541 master-0 kubenswrapper[17644]: I0319 12:02:09.059490 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="d5f502b117c7c8479f7f20848a50fec0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954"
Mar 19 12:02:09.060526 master-0 kubenswrapper[17644]: I0319 12:02:09.060501 17644 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660"
Mar 19 12:02:09.060526 master-0 kubenswrapper[17644]: I0319 12:02:09.060526 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:02:10.064587 master-0 kubenswrapper[17644]: I0319 12:02:10.064513 17644 kubelet.go:1909] "Trying to delete pod"
pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:10.065260 master-0 kubenswrapper[17644]: I0319 12:02:10.065245 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="5aa761d6-a121-4236-8841-c87d16450405"
Mar 19 12:02:10.067506 master-0 kubenswrapper[17644]: I0319 12:02:10.067483 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="d5f502b117c7c8479f7f20848a50fec0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954"
Mar 19 12:02:12.318292 master-0 kubenswrapper[17644]: I0319 12:02:12.318171 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 12:02:12.323778 master-0 kubenswrapper[17644]: I0319 12:02:12.323711 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 12:02:13.090297 master-0 kubenswrapper[17644]: I0319 12:02:13.090254 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 12:02:17.440146 master-0 kubenswrapper[17644]: I0319 12:02:17.440091 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 12:02:18.047155 master-0 kubenswrapper[17644]: I0319 12:02:18.047094 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 12:02:18.146604 master-0 kubenswrapper[17644]: I0319 12:02:18.146499 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 12:02:18.690993 master-0 kubenswrapper[17644]: I0319 12:02:18.690940 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 12:02:19.057892 master-0 kubenswrapper[17644]: I0319 12:02:19.057644 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 12:02:19.091698 master-0 kubenswrapper[17644]: I0319 12:02:19.091625 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 12:02:19.590614 master-0 kubenswrapper[17644]: I0319 12:02:19.590516 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 12:02:19.916793 master-0 kubenswrapper[17644]: I0319 12:02:19.916585 17644 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 12:02:20.141010 master-0 kubenswrapper[17644]: I0319 12:02:20.140896 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 12:02:20.186787 master-0 kubenswrapper[17644]: I0319 12:02:20.186681 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 12:02:20.205913 master-0 kubenswrapper[17644]: I0319 12:02:20.205843 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 12:02:20.262817 master-0 kubenswrapper[17644]: I0319 12:02:20.261620 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 12:02:20.263088 master-0 kubenswrapper[17644]: I0319 12:02:20.262140 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-cjb2h"
Mar 19 12:02:20.263088 master-0 kubenswrapper[17644]: I0319 12:02:20.262251 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 12:02:20.296875 master-0 kubenswrapper[17644]: I0319 12:02:20.296824 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 12:02:20.404666 master-0 kubenswrapper[17644]: I0319 12:02:20.404598 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 19 12:02:20.481422 master-0 kubenswrapper[17644]: I0319 12:02:20.481263 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 12:02:20.520131 master-0 kubenswrapper[17644]: I0319 12:02:20.520046 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-w94s8"
Mar 19 12:02:20.589129 master-0 kubenswrapper[17644]: I0319 12:02:20.589078 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-zf4zz"
Mar 19 12:02:20.652202 master-0 kubenswrapper[17644]: I0319 12:02:20.652151 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 12:02:20.669301 master-0 kubenswrapper[17644]: I0319 12:02:20.669252 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-kn6lc"
Mar 19 12:02:20.761159 master-0 kubenswrapper[17644]: I0319 12:02:20.761047 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-sx7wj"
Mar 19 12:02:21.043706 master-0 kubenswrapper[17644]: I0319 12:02:21.043577 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 12:02:21.155768 master-0 kubenswrapper[17644]: I0319 12:02:21.155699 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 12:02:21.311516 master-0 kubenswrapper[17644]: I0319 12:02:21.311283 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 12:02:21.401683 master-0 kubenswrapper[17644]: I0319 12:02:21.401592 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 12:02:21.517584 master-0 kubenswrapper[17644]: I0319 12:02:21.517503 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 19 12:02:21.685329 master-0 kubenswrapper[17644]: I0319 12:02:21.685252 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 12:02:21.753174 master-0 kubenswrapper[17644]: I0319 12:02:21.753105 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 12:02:21.802694 master-0 kubenswrapper[17644]: I0319 12:02:21.802589 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 12:02:21.809069 master-0 kubenswrapper[17644]: I0319 12:02:21.809028 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 12:02:21.811192 master-0 kubenswrapper[17644]: I0319 12:02:21.811146 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 12:02:21.929089 master-0 kubenswrapper[17644]: I0319 12:02:21.928992 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 12:02:22.066908 master-0 kubenswrapper[17644]: I0319 12:02:22.066664 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-8h56m"
Mar 19 12:02:22.102289 master-0 kubenswrapper[17644]: I0319 12:02:22.102224 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 12:02:22.109614 master-0 kubenswrapper[17644]: I0319 12:02:22.109570 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 12:02:22.279833 master-0 kubenswrapper[17644]: I0319 12:02:22.279757 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 19 12:02:22.431532 master-0 kubenswrapper[17644]: I0319 12:02:22.431370 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 12:02:22.580604 master-0 kubenswrapper[17644]: I0319 12:02:22.580531 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:02:22.635506 master-0 kubenswrapper[17644]: I0319 12:02:22.635406 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:02:22.674527 master-0 kubenswrapper[17644]: I0319 12:02:22.674448 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 12:02:22.706620 master-0 kubenswrapper[17644]: I0319 12:02:22.706544 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 12:02:22.718365 master-0 kubenswrapper[17644]: I0319 12:02:22.718267 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar
19 12:02:22.739931 master-0 kubenswrapper[17644]: I0319 12:02:22.739830 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 12:02:22.754465 master-0 kubenswrapper[17644]: I0319 12:02:22.754382 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 12:02:22.760224 master-0 kubenswrapper[17644]: I0319 12:02:22.760182 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 12:02:22.765386 master-0 kubenswrapper[17644]: I0319 12:02:22.765316 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-qsmbf"
Mar 19 12:02:22.781860 master-0 kubenswrapper[17644]: I0319 12:02:22.781793 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 12:02:22.880335 master-0 kubenswrapper[17644]: I0319 12:02:22.880266 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 19 12:02:22.936381 master-0 kubenswrapper[17644]: I0319 12:02:22.936288 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 12:02:22.950432 master-0 kubenswrapper[17644]: I0319 12:02:22.950345 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 12:02:23.044216 master-0 kubenswrapper[17644]: I0319 12:02:23.044026 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 12:02:23.044603 master-0 kubenswrapper[17644]: I0319 12:02:23.044509 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 12:02:23.079845 master-0 kubenswrapper[17644]: I0319 12:02:23.079777 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 12:02:23.141758 master-0 kubenswrapper[17644]: I0319 12:02:23.141648 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 12:02:23.162089 master-0 kubenswrapper[17644]: I0319 12:02:23.162022 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 12:02:23.226332 master-0 kubenswrapper[17644]: I0319 12:02:23.226252 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 12:02:23.240028 master-0 kubenswrapper[17644]: I0319 12:02:23.239968 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 12:02:23.322869 master-0 kubenswrapper[17644]: I0319 12:02:23.322698 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 19 12:02:23.330602 master-0 kubenswrapper[17644]: I0319 12:02:23.330554 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 12:02:23.350618 master-0 kubenswrapper[17644]: I0319 12:02:23.350548 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 12:02:23.358103 master-0 kubenswrapper[17644]: I0319 12:02:23.358050 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 12:02:23.434477 master-0 kubenswrapper[17644]: I0319 12:02:23.434386 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 12:02:23.461447 master-0 kubenswrapper[17644]: I0319 12:02:23.461370 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 12:02:23.559449 master-0 kubenswrapper[17644]: I0319 12:02:23.559331 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 12:02:23.592100 master-0 kubenswrapper[17644]: I0319 12:02:23.591893 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 12:02:23.594972 master-0 kubenswrapper[17644]: I0319 12:02:23.594927 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 12:02:23.608456 master-0 kubenswrapper[17644]: I0319 12:02:23.608422 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 12:02:23.623496 master-0 kubenswrapper[17644]: I0319 12:02:23.623449 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 12:02:23.653903 master-0 kubenswrapper[17644]: I0319 12:02:23.653830 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 12:02:23.653903 master-0 kubenswrapper[17644]: I0319 12:02:23.653829 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 12:02:23.676102 master-0 kubenswrapper[17644]: I0319 12:02:23.676050 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 12:02:23.678762 master-0 kubenswrapper[17644]: I0319 12:02:23.678721 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-5jj8d"
Mar 19 12:02:23.684961 master-0 kubenswrapper[17644]: I0319 12:02:23.684930 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 12:02:23.712629 master-0 kubenswrapper[17644]: I0319 12:02:23.712566 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 12:02:23.790719 master-0 kubenswrapper[17644]: I0319 12:02:23.790654 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 12:02:23.895196 master-0 kubenswrapper[17644]: I0319 12:02:23.895080 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 12:02:23.897500 master-0 kubenswrapper[17644]: I0319 12:02:23.897467 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 12:02:23.949938 master-0 kubenswrapper[17644]: I0319 12:02:23.949890 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 12:02:23.997078 master-0 kubenswrapper[17644]: I0319 12:02:23.997023 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 12:02:24.075298 master-0 kubenswrapper[17644]: I0319 12:02:24.075238 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-c95p8"
Mar 19 12:02:24.100923 master-0 kubenswrapper[17644]: I0319 12:02:24.100859 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 12:02:24.113154 master-0 kubenswrapper[17644]: I0319 12:02:24.113102 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 12:02:24.155676 master-0 kubenswrapper[17644]: I0319 12:02:24.155538 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 12:02:24.159952 master-0 kubenswrapper[17644]: I0319 12:02:24.159920 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 12:02:24.243097 master-0 kubenswrapper[17644]: I0319 12:02:24.243011 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 12:02:24.253756 master-0 kubenswrapper[17644]: I0319 12:02:24.253644 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 12:02:24.277764 master-0 kubenswrapper[17644]: I0319 12:02:24.277701 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 12:02:24.310549 master-0 kubenswrapper[17644]: I0319 12:02:24.310481 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 12:02:24.372060 master-0 kubenswrapper[17644]: I0319 12:02:24.371998 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 12:02:24.438178 master-0 kubenswrapper[17644]: I0319 12:02:24.438117 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 12:02:24.445598 master-0 kubenswrapper[17644]: I0319 12:02:24.445562 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 12:02:24.509367 master-0 kubenswrapper[17644]: I0319 12:02:24.509291 17644 reflector.go:368] Caches populated for
*v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 12:02:24.654492 master-0 kubenswrapper[17644]: I0319 12:02:24.654418 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 12:02:24.753330 master-0 kubenswrapper[17644]: I0319 12:02:24.753083 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 12:02:24.820628 master-0 kubenswrapper[17644]: I0319 12:02:24.820574 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 12:02:24.980329 master-0 kubenswrapper[17644]: I0319 12:02:24.980242 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 12:02:25.002673 master-0 kubenswrapper[17644]: I0319 12:02:25.002612 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 12:02:25.063581 master-0 kubenswrapper[17644]: I0319 12:02:25.063425 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 12:02:25.109553 master-0 kubenswrapper[17644]: I0319 12:02:25.109463 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-899lw" Mar 19 12:02:25.140405 master-0 kubenswrapper[17644]: I0319 12:02:25.140323 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-dnwcp" Mar 19 12:02:25.226801 master-0 kubenswrapper[17644]: I0319 12:02:25.226699 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-6qwsh" Mar 19 12:02:25.234917 master-0 
kubenswrapper[17644]: I0319 12:02:25.234858 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 12:02:25.244298 master-0 kubenswrapper[17644]: I0319 12:02:25.244252 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 12:02:25.251825 master-0 kubenswrapper[17644]: I0319 12:02:25.251769 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 12:02:25.255611 master-0 kubenswrapper[17644]: I0319 12:02:25.255526 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 12:02:25.270599 master-0 kubenswrapper[17644]: I0319 12:02:25.270528 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 12:02:25.332668 master-0 kubenswrapper[17644]: I0319 12:02:25.332513 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 12:02:25.461793 master-0 kubenswrapper[17644]: I0319 12:02:25.461713 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 12:02:25.613248 master-0 kubenswrapper[17644]: I0319 12:02:25.613087 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 12:02:25.632759 master-0 kubenswrapper[17644]: I0319 12:02:25.632691 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 12:02:25.770079 master-0 kubenswrapper[17644]: I0319 12:02:25.770033 17644 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 12:02:25.853470 master-0 kubenswrapper[17644]: 
I0319 12:02:25.853398 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 12:02:25.949193 master-0 kubenswrapper[17644]: I0319 12:02:25.949134 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 19 12:02:25.982551 master-0 kubenswrapper[17644]: I0319 12:02:25.982428 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 12:02:25.990003 master-0 kubenswrapper[17644]: I0319 12:02:25.989935 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 12:02:26.043668 master-0 kubenswrapper[17644]: I0319 12:02:26.043302 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 12:02:26.068550 master-0 kubenswrapper[17644]: I0319 12:02:26.068207 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 12:02:26.170017 master-0 kubenswrapper[17644]: I0319 12:02:26.169923 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 12:02:26.180794 master-0 kubenswrapper[17644]: I0319 12:02:26.180730 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 12:02:26.231970 master-0 kubenswrapper[17644]: I0319 12:02:26.231791 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 12:02:26.240900 master-0 kubenswrapper[17644]: I0319 12:02:26.240847 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 12:02:26.319195 master-0 
kubenswrapper[17644]: I0319 12:02:26.319125 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 12:02:26.359906 master-0 kubenswrapper[17644]: I0319 12:02:26.359721 17644 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 12:02:26.397156 master-0 kubenswrapper[17644]: I0319 12:02:26.396867 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 12:02:26.401849 master-0 kubenswrapper[17644]: I0319 12:02:26.401800 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 12:02:26.447608 master-0 kubenswrapper[17644]: I0319 12:02:26.447550 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 12:02:26.453719 master-0 kubenswrapper[17644]: I0319 12:02:26.453675 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 19 12:02:26.465952 master-0 kubenswrapper[17644]: I0319 12:02:26.465911 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 12:02:26.486153 master-0 kubenswrapper[17644]: I0319 12:02:26.486056 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 19 12:02:26.554208 master-0 kubenswrapper[17644]: I0319 12:02:26.554127 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 12:02:26.607038 master-0 kubenswrapper[17644]: I0319 12:02:26.606960 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 12:02:26.631283 master-0 
kubenswrapper[17644]: I0319 12:02:26.631225 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 19 12:02:26.672424 master-0 kubenswrapper[17644]: I0319 12:02:26.672358 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 12:02:26.997475 master-0 kubenswrapper[17644]: I0319 12:02:26.997406 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 12:02:27.029004 master-0 kubenswrapper[17644]: I0319 12:02:27.028946 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 12:02:27.056763 master-0 kubenswrapper[17644]: I0319 12:02:27.056670 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 12:02:27.109806 master-0 kubenswrapper[17644]: I0319 12:02:27.109752 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 12:02:27.209487 master-0 kubenswrapper[17644]: I0319 12:02:27.209423 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 12:02:27.388999 master-0 kubenswrapper[17644]: I0319 12:02:27.388890 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 12:02:27.400475 master-0 kubenswrapper[17644]: I0319 12:02:27.400425 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-9hcb7" Mar 19 12:02:27.441950 master-0 kubenswrapper[17644]: I0319 12:02:27.441888 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 12:02:27.443231 master-0 
kubenswrapper[17644]: I0319 12:02:27.443189 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 12:02:27.481500 master-0 kubenswrapper[17644]: I0319 12:02:27.481446 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 12:02:27.504021 master-0 kubenswrapper[17644]: I0319 12:02:27.503954 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 12:02:27.604198 master-0 kubenswrapper[17644]: I0319 12:02:27.604148 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 12:02:27.614808 master-0 kubenswrapper[17644]: I0319 12:02:27.614698 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 12:02:27.644548 master-0 kubenswrapper[17644]: I0319 12:02:27.644395 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 19 12:02:27.692780 master-0 kubenswrapper[17644]: I0319 12:02:27.692714 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 12:02:27.752092 master-0 kubenswrapper[17644]: I0319 12:02:27.752035 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 19 12:02:27.902167 master-0 kubenswrapper[17644]: I0319 12:02:27.902040 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 12:02:27.951076 master-0 kubenswrapper[17644]: I0319 12:02:27.950987 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 12:02:27.992357 master-0 
kubenswrapper[17644]: I0319 12:02:27.992292 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 19 12:02:27.993846 master-0 kubenswrapper[17644]: I0319 12:02:27.993788 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 12:02:28.039893 master-0 kubenswrapper[17644]: I0319 12:02:28.039837 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 12:02:28.107704 master-0 kubenswrapper[17644]: I0319 12:02:28.107640 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 19 12:02:28.125603 master-0 kubenswrapper[17644]: I0319 12:02:28.125551 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 12:02:28.130499 master-0 kubenswrapper[17644]: I0319 12:02:28.130448 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 12:02:28.172179 master-0 kubenswrapper[17644]: I0319 12:02:28.172042 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-g72px" Mar 19 12:02:28.203260 master-0 kubenswrapper[17644]: I0319 12:02:28.203180 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 12:02:28.207404 master-0 kubenswrapper[17644]: I0319 12:02:28.207357 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 19 12:02:28.241148 master-0 kubenswrapper[17644]: I0319 12:02:28.241078 17644 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"metrics-server-6ro5itlgu7nag" Mar 19 12:02:28.290956 master-0 kubenswrapper[17644]: I0319 12:02:28.290891 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 12:02:28.377618 master-0 kubenswrapper[17644]: I0319 12:02:28.377550 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-6gp54" Mar 19 12:02:28.645594 master-0 kubenswrapper[17644]: I0319 12:02:28.645540 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 12:02:28.715864 master-0 kubenswrapper[17644]: I0319 12:02:28.715402 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 12:02:28.716218 master-0 kubenswrapper[17644]: I0319 12:02:28.716190 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 12:02:28.747219 master-0 kubenswrapper[17644]: I0319 12:02:28.747158 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 12:02:28.786162 master-0 kubenswrapper[17644]: I0319 12:02:28.786064 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 12:02:28.801756 master-0 kubenswrapper[17644]: I0319 12:02:28.801707 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 12:02:28.837241 master-0 kubenswrapper[17644]: I0319 12:02:28.837200 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 19 12:02:28.903455 master-0 kubenswrapper[17644]: I0319 12:02:28.903300 17644 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-sjgm7" Mar 19 12:02:28.943580 master-0 kubenswrapper[17644]: I0319 12:02:28.943528 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-68jgh" Mar 19 12:02:28.983357 master-0 kubenswrapper[17644]: I0319 12:02:28.982590 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 12:02:29.010158 master-0 kubenswrapper[17644]: I0319 12:02:29.010088 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 19 12:02:29.186682 master-0 kubenswrapper[17644]: I0319 12:02:29.186622 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 12:02:29.233346 master-0 kubenswrapper[17644]: I0319 12:02:29.233306 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 12:02:29.274787 master-0 kubenswrapper[17644]: I0319 12:02:29.274726 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 12:02:29.291813 master-0 kubenswrapper[17644]: I0319 12:02:29.291759 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 12:02:29.391496 master-0 kubenswrapper[17644]: I0319 12:02:29.391422 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 12:02:29.587518 master-0 kubenswrapper[17644]: I0319 12:02:29.587386 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 12:02:29.615002 master-0 kubenswrapper[17644]: I0319 
12:02:29.614948 17644 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 12:02:29.620011 master-0 kubenswrapper[17644]: I0319 12:02:29.619961 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 12:02:29.620011 master-0 kubenswrapper[17644]: I0319 12:02:29.620014 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 12:02:29.624050 master-0 kubenswrapper[17644]: I0319 12:02:29.624004 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:02:29.639847 master-0 kubenswrapper[17644]: I0319 12:02:29.639736 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=21.639709593 podStartE2EDuration="21.639709593s" podCreationTimestamp="2026-03-19 12:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:02:29.636864621 +0000 UTC m=+183.406822666" watchObservedRunningTime="2026-03-19 12:02:29.639709593 +0000 UTC m=+183.409667638" Mar 19 12:02:29.641866 master-0 kubenswrapper[17644]: I0319 12:02:29.641828 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 12:02:29.671678 master-0 kubenswrapper[17644]: I0319 12:02:29.671625 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 12:02:29.699685 master-0 kubenswrapper[17644]: I0319 12:02:29.699642 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 12:02:29.712884 master-0 kubenswrapper[17644]: I0319 12:02:29.712843 17644 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 12:02:29.722350 master-0 kubenswrapper[17644]: I0319 12:02:29.722302 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 12:02:29.810990 master-0 kubenswrapper[17644]: I0319 12:02:29.810951 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 19 12:02:29.840286 master-0 kubenswrapper[17644]: I0319 12:02:29.840158 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-7qzrj" Mar 19 12:02:29.852169 master-0 kubenswrapper[17644]: I0319 12:02:29.852131 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 12:02:29.939649 master-0 kubenswrapper[17644]: I0319 12:02:29.939607 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 12:02:29.980649 master-0 kubenswrapper[17644]: I0319 12:02:29.980598 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 12:02:29.989101 master-0 kubenswrapper[17644]: I0319 12:02:29.989071 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-75w76" Mar 19 12:02:30.049077 master-0 kubenswrapper[17644]: I0319 12:02:30.049010 17644 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 12:02:30.213917 master-0 kubenswrapper[17644]: I0319 12:02:30.213865 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 12:02:30.214482 master-0 kubenswrapper[17644]: I0319 12:02:30.214451 17644 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="85632c1cec8974aa874834e4cfff4c77" containerName="startup-monitor" containerID="cri-o://39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96" gracePeriod=5 Mar 19 12:02:30.221986 master-0 kubenswrapper[17644]: I0319 12:02:30.221944 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-6zbld" Mar 19 12:02:30.227625 master-0 kubenswrapper[17644]: I0319 12:02:30.227600 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 12:02:30.278861 master-0 kubenswrapper[17644]: I0319 12:02:30.278827 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-2l456" Mar 19 12:02:30.292333 master-0 kubenswrapper[17644]: I0319 12:02:30.292311 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 12:02:30.346589 master-0 kubenswrapper[17644]: I0319 12:02:30.346549 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 12:02:30.400120 master-0 kubenswrapper[17644]: I0319 12:02:30.400086 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 12:02:30.428334 master-0 kubenswrapper[17644]: I0319 12:02:30.428275 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 12:02:30.522306 master-0 kubenswrapper[17644]: I0319 12:02:30.522193 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 12:02:30.557761 master-0 kubenswrapper[17644]: I0319 
12:02:30.556859 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 12:02:30.561758 master-0 kubenswrapper[17644]: I0319 12:02:30.560331 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 12:02:30.631733 master-0 kubenswrapper[17644]: I0319 12:02:30.631679 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 19 12:02:30.651034 master-0 kubenswrapper[17644]: I0319 12:02:30.650998 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 12:02:30.682792 master-0 kubenswrapper[17644]: I0319 12:02:30.682721 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 12:02:30.754833 master-0 kubenswrapper[17644]: I0319 12:02:30.754768 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 12:02:30.784153 master-0 kubenswrapper[17644]: I0319 12:02:30.783958 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 12:02:30.809216 master-0 kubenswrapper[17644]: I0319 12:02:30.809152 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 12:02:30.811572 master-0 kubenswrapper[17644]: I0319 12:02:30.811546 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 12:02:30.830016 master-0 kubenswrapper[17644]: I0319 12:02:30.829973 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 12:02:30.921644 master-0 kubenswrapper[17644]: 
I0319 12:02:30.921601 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 12:02:30.927161 master-0 kubenswrapper[17644]: I0319 12:02:30.927127 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 12:02:31.005263 master-0 kubenswrapper[17644]: I0319 12:02:31.005209 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 12:02:31.101546 master-0 kubenswrapper[17644]: I0319 12:02:31.101415 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-7gswr"
Mar 19 12:02:31.203777 master-0 kubenswrapper[17644]: I0319 12:02:31.203697 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 12:02:31.214256 master-0 kubenswrapper[17644]: I0319 12:02:31.214225 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-6nq75"
Mar 19 12:02:31.288468 master-0 kubenswrapper[17644]: I0319 12:02:31.288396 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 12:02:31.429828 master-0 kubenswrapper[17644]: I0319 12:02:31.428935 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 12:02:31.500344 master-0 kubenswrapper[17644]: I0319 12:02:31.500291 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8skrb"
Mar 19 12:02:31.527532 master-0 kubenswrapper[17644]: I0319 12:02:31.527486 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 12:02:31.664733 master-0 kubenswrapper[17644]: I0319 12:02:31.664523 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 12:02:31.697334 master-0 kubenswrapper[17644]: I0319 12:02:31.697279 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 19 12:02:31.825332 master-0 kubenswrapper[17644]: I0319 12:02:31.825270 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 12:02:31.826868 master-0 kubenswrapper[17644]: I0319 12:02:31.826832 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 12:02:31.849768 master-0 kubenswrapper[17644]: I0319 12:02:31.849699 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 12:02:31.914926 master-0 kubenswrapper[17644]: I0319 12:02:31.914864 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-zmgxk"
Mar 19 12:02:31.916781 master-0 kubenswrapper[17644]: I0319 12:02:31.916710 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 12:02:31.946397 master-0 kubenswrapper[17644]: I0319 12:02:31.946246 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 12:02:31.988518 master-0 kubenswrapper[17644]: I0319 12:02:31.988296 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-strbt"
Mar 19 12:02:32.012408 master-0 kubenswrapper[17644]: I0319 12:02:32.012187 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 12:02:32.056833 master-0 kubenswrapper[17644]: I0319 12:02:32.056626 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 12:02:32.085803 master-0 kubenswrapper[17644]: I0319 12:02:32.085539 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 12:02:32.123323 master-0 kubenswrapper[17644]: I0319 12:02:32.123232 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 12:02:32.124153 master-0 kubenswrapper[17644]: I0319 12:02:32.124125 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 12:02:32.208025 master-0 kubenswrapper[17644]: I0319 12:02:32.207964 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-dfb6p"
Mar 19 12:02:32.317976 master-0 kubenswrapper[17644]: I0319 12:02:32.317857 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 12:02:32.364939 master-0 kubenswrapper[17644]: I0319 12:02:32.364858 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 12:02:32.421164 master-0 kubenswrapper[17644]: I0319 12:02:32.421096 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 12:02:32.479386 master-0 kubenswrapper[17644]: I0319 12:02:32.479340 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 12:02:32.486935 master-0 kubenswrapper[17644]: I0319 12:02:32.486706 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-pdvk4"
Mar 19 12:02:32.631272 master-0 kubenswrapper[17644]: I0319 12:02:32.631231 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 12:02:32.789031 master-0 kubenswrapper[17644]: I0319 12:02:32.788982 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 12:02:32.814016 master-0 kubenswrapper[17644]: I0319 12:02:32.813843 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 12:02:32.858844 master-0 kubenswrapper[17644]: I0319 12:02:32.858705 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 12:02:32.974784 master-0 kubenswrapper[17644]: I0319 12:02:32.974715 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 12:02:32.988987 master-0 kubenswrapper[17644]: I0319 12:02:32.988947 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 12:02:33.036436 master-0 kubenswrapper[17644]: I0319 12:02:33.036372 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 19 12:02:33.037166 master-0 kubenswrapper[17644]: I0319 12:02:33.037129 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 12:02:33.039858 master-0 kubenswrapper[17644]: I0319 12:02:33.039823 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 12:02:33.043427 master-0 kubenswrapper[17644]: I0319 12:02:33.043364 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 12:02:33.119874 master-0 kubenswrapper[17644]: I0319 12:02:33.119747 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 12:02:33.168549 master-0 kubenswrapper[17644]: I0319 12:02:33.168467 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 12:02:33.176256 master-0 kubenswrapper[17644]: I0319 12:02:33.176205 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 12:02:33.212721 master-0 kubenswrapper[17644]: I0319 12:02:33.212682 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 12:02:33.292530 master-0 kubenswrapper[17644]: I0319 12:02:33.292465 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 12:02:33.306961 master-0 kubenswrapper[17644]: I0319 12:02:33.306919 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 12:02:33.360081 master-0 kubenswrapper[17644]: I0319 12:02:33.360016 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 12:02:33.438082 master-0 kubenswrapper[17644]: I0319 12:02:33.438003 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 12:02:33.678025 master-0 kubenswrapper[17644]: I0319 12:02:33.677901 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 12:02:33.715286 master-0 kubenswrapper[17644]: I0319 12:02:33.715182 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 19 12:02:33.916252 master-0 kubenswrapper[17644]: I0319 12:02:33.916196 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 12:02:34.455350 master-0 kubenswrapper[17644]: I0319 12:02:34.455272 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 12:02:34.498265 master-0 kubenswrapper[17644]: I0319 12:02:34.498221 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 12:02:34.558566 master-0 kubenswrapper[17644]: I0319 12:02:34.558496 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 12:02:34.689698 master-0 kubenswrapper[17644]: I0319 12:02:34.689655 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 12:02:35.379464 master-0 kubenswrapper[17644]: I0319 12:02:35.379390 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 12:02:35.425485 master-0 kubenswrapper[17644]: I0319 12:02:35.425430 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 12:02:35.458131 master-0 kubenswrapper[17644]: I0319 12:02:35.458079 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 12:02:35.565999 master-0 kubenswrapper[17644]: I0319 12:02:35.565835 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 12:02:35.778037 master-0 kubenswrapper[17644]: I0319 12:02:35.778004 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_85632c1cec8974aa874834e4cfff4c77/startup-monitor/0.log"
Mar 19 12:02:35.778295 master-0 kubenswrapper[17644]: I0319 12:02:35.778280 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:02:35.901712 master-0 kubenswrapper[17644]: I0319 12:02:35.901599 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") "
Mar 19 12:02:35.901712 master-0 kubenswrapper[17644]: I0319 12:02:35.901668 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") "
Mar 19 12:02:35.901943 master-0 kubenswrapper[17644]: I0319 12:02:35.901675 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock" (OuterVolumeSpecName: "var-lock") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:02:35.901943 master-0 kubenswrapper[17644]: I0319 12:02:35.901710 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:02:35.901943 master-0 kubenswrapper[17644]: I0319 12:02:35.901783 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") "
Mar 19 12:02:35.901943 master-0 kubenswrapper[17644]: I0319 12:02:35.901818 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") "
Mar 19 12:02:35.901943 master-0 kubenswrapper[17644]: I0319 12:02:35.901841 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") "
Mar 19 12:02:35.901943 master-0 kubenswrapper[17644]: I0319 12:02:35.901883 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log" (OuterVolumeSpecName: "var-log") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:02:35.901943 master-0 kubenswrapper[17644]: I0319 12:02:35.901914 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests" (OuterVolumeSpecName: "manifests") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:02:35.902895 master-0 kubenswrapper[17644]: I0319 12:02:35.902230 17644 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:35.902895 master-0 kubenswrapper[17644]: I0319 12:02:35.902253 17644 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:35.902895 master-0 kubenswrapper[17644]: I0319 12:02:35.902265 17644 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:35.902895 master-0 kubenswrapper[17644]: I0319 12:02:35.902277 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:35.907031 master-0 kubenswrapper[17644]: I0319 12:02:35.906978 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:02:36.004570 master-0 kubenswrapper[17644]: I0319 12:02:36.004530 17644 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:36.237603 master-0 kubenswrapper[17644]: I0319 12:02:36.237542 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_85632c1cec8974aa874834e4cfff4c77/startup-monitor/0.log"
Mar 19 12:02:36.238024 master-0 kubenswrapper[17644]: I0319 12:02:36.237993 17644 generic.go:334] "Generic (PLEG): container finished" podID="85632c1cec8974aa874834e4cfff4c77" containerID="39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96" exitCode=137
Mar 19 12:02:36.238189 master-0 kubenswrapper[17644]: I0319 12:02:36.238132 17644 scope.go:117] "RemoveContainer" containerID="39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96"
Mar 19 12:02:36.238266 master-0 kubenswrapper[17644]: I0319 12:02:36.238193 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:02:36.253928 master-0 kubenswrapper[17644]: I0319 12:02:36.253865 17644 scope.go:117] "RemoveContainer" containerID="39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96"
Mar 19 12:02:36.254560 master-0 kubenswrapper[17644]: E0319 12:02:36.254524 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96\": container with ID starting with 39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96 not found: ID does not exist" containerID="39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96"
Mar 19 12:02:36.254638 master-0 kubenswrapper[17644]: I0319 12:02:36.254559 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96"} err="failed to get container status \"39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96\": rpc error: code = NotFound desc = could not find container \"39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96\": container with ID starting with 39dc59fe7a4082b0e522e88e6a942490b0f3386aa516717a966f593be1d80d96 not found: ID does not exist"
Mar 19 12:02:36.337618 master-0 kubenswrapper[17644]: I0319 12:02:36.337555 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 12:02:36.493568 master-0 kubenswrapper[17644]: I0319 12:02:36.493446 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85632c1cec8974aa874834e4cfff4c77" path="/var/lib/kubelet/pods/85632c1cec8974aa874834e4cfff4c77/volumes"
Mar 19 12:02:53.338664 master-0 kubenswrapper[17644]: I0319 12:02:53.338528 17644 generic.go:334] "Generic (PLEG): container finished" podID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerID="147ea002de1d61b828f3e4f59b89474a76a533a161c3a8b138665844ccc9c433" exitCode=0
Mar 19 12:02:53.339439 master-0 kubenswrapper[17644]: I0319 12:02:53.338635 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" event={"ID":"b3de8a1b-a5be-414f-86e8-738e16c8bc97","Type":"ContainerDied","Data":"147ea002de1d61b828f3e4f59b89474a76a533a161c3a8b138665844ccc9c433"}
Mar 19 12:02:53.339618 master-0 kubenswrapper[17644]: I0319 12:02:53.339596 17644 scope.go:117] "RemoveContainer" containerID="ecca7c744f565812652616c950bf4c3ba074defb48c439f60ea10ec59b205e80"
Mar 19 12:02:53.340322 master-0 kubenswrapper[17644]: I0319 12:02:53.340272 17644 scope.go:117] "RemoveContainer" containerID="147ea002de1d61b828f3e4f59b89474a76a533a161c3a8b138665844ccc9c433"
Mar 19 12:02:54.345482 master-0 kubenswrapper[17644]: I0319 12:02:54.345423 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" event={"ID":"b3de8a1b-a5be-414f-86e8-738e16c8bc97","Type":"ContainerStarted","Data":"78a6cddfc0e0acdf16c683d2f70de891e291d18f727a26ee57b67fdf44168c74"}
Mar 19 12:02:54.346761 master-0 kubenswrapper[17644]: I0319 12:02:54.346715 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 12:02:55.134977 master-0 kubenswrapper[17644]: I0319 12:02:55.134923 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 12:02:55.505130 master-0 kubenswrapper[17644]: I0319 12:02:55.505046 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-548bb99f44-txbjj"]
Mar 19 12:02:55.505709 master-0 kubenswrapper[17644]: I0319 12:02:55.505416 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" containerID="cri-o://f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b" gracePeriod=30
Mar 19 12:02:55.587207 master-0 kubenswrapper[17644]: I0319 12:02:55.587086 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"]
Mar 19 12:02:55.587779 master-0 kubenswrapper[17644]: I0319 12:02:55.587454 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" podUID="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" containerName="route-controller-manager" containerID="cri-o://f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb" gracePeriod=30
Mar 19 12:02:56.031021 master-0 kubenswrapper[17644]: I0319 12:02:56.030315 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-retry-1-master-0"]
Mar 19 12:02:56.031021 master-0 kubenswrapper[17644]: E0319 12:02:56.030716 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" containerName="installer"
Mar 19 12:02:56.031021 master-0 kubenswrapper[17644]: I0319 12:02:56.030749 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" containerName="installer"
Mar 19 12:02:56.031021 master-0 kubenswrapper[17644]: E0319 12:02:56.030772 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85632c1cec8974aa874834e4cfff4c77" containerName="startup-monitor"
Mar 19 12:02:56.031021 master-0 kubenswrapper[17644]: I0319 12:02:56.030781 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="85632c1cec8974aa874834e4cfff4c77" containerName="startup-monitor"
Mar 19 12:02:56.031021 master-0 kubenswrapper[17644]: I0319 12:02:56.030937 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="85632c1cec8974aa874834e4cfff4c77" containerName="startup-monitor"
Mar 19 12:02:56.031021 master-0 kubenswrapper[17644]: I0319 12:02:56.030959 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92" containerName="installer"
Mar 19 12:02:56.031838 master-0 kubenswrapper[17644]: I0319 12:02:56.031639 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.034802 master-0 kubenswrapper[17644]: I0319 12:02:56.034763 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-4vwst"
Mar 19 12:02:56.035038 master-0 kubenswrapper[17644]: I0319 12:02:56.035022 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 19 12:02:56.055083 master-0 kubenswrapper[17644]: I0319 12:02:56.055015 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-retry-1-master-0"]
Mar 19 12:02:56.075291 master-0 kubenswrapper[17644]: I0319 12:02:56.075223 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj"
Mar 19 12:02:56.077029 master-0 kubenswrapper[17644]: I0319 12:02:56.076983 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"
Mar 19 12:02:56.095953 master-0 kubenswrapper[17644]: I0319 12:02:56.095896 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kube-api-access\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.095953 master-0 kubenswrapper[17644]: I0319 12:02:56.095965 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-var-lock\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.096359 master-0 kubenswrapper[17644]: I0319 12:02:56.096033 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kubelet-dir\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.197600 master-0 kubenswrapper[17644]: I0319 12:02:56.197511 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj527\" (UniqueName: \"kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527\") pod \"76cf2b01-33d9-47eb-be5d-44946c78bf20\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") "
Mar 19 12:02:56.197914 master-0 kubenswrapper[17644]: I0319 12:02:56.197619 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca\") pod \"76cf2b01-33d9-47eb-be5d-44946c78bf20\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") "
Mar 19 12:02:56.197914 master-0 kubenswrapper[17644]: I0319 12:02:56.197673 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert\") pod \"76cf2b01-33d9-47eb-be5d-44946c78bf20\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") "
Mar 19 12:02:56.197914 master-0 kubenswrapper[17644]: I0319 12:02:56.197721 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config\") pod \"76cf2b01-33d9-47eb-be5d-44946c78bf20\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") "
Mar 19 12:02:56.197914 master-0 kubenswrapper[17644]: I0319 12:02:56.197843 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5skx\" (UniqueName: \"kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx\") pod \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") "
Mar 19 12:02:56.197914 master-0 kubenswrapper[17644]: I0319 12:02:56.197898 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca\") pod \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") "
Mar 19 12:02:56.198067 master-0 kubenswrapper[17644]: I0319 12:02:56.197948 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config\") pod \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") "
Mar 19 12:02:56.198067 master-0 kubenswrapper[17644]: I0319 12:02:56.197981 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert\") pod \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\" (UID: \"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85\") "
Mar 19 12:02:56.198067 master-0 kubenswrapper[17644]: I0319 12:02:56.198050 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles\") pod \"76cf2b01-33d9-47eb-be5d-44946c78bf20\" (UID: \"76cf2b01-33d9-47eb-be5d-44946c78bf20\") "
Mar 19 12:02:56.198816 master-0 kubenswrapper[17644]: I0319 12:02:56.198433 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kube-api-access\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.198816 master-0 kubenswrapper[17644]: I0319 12:02:56.198451 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca" (OuterVolumeSpecName: "client-ca") pod "76cf2b01-33d9-47eb-be5d-44946c78bf20" (UID: "76cf2b01-33d9-47eb-be5d-44946c78bf20"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:02:56.198816 master-0 kubenswrapper[17644]: I0319 12:02:56.198510 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-var-lock\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.198816 master-0 kubenswrapper[17644]: I0319 12:02:56.198544 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kubelet-dir\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.198816 master-0 kubenswrapper[17644]: I0319 12:02:56.198675 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kubelet-dir\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.198816 master-0 kubenswrapper[17644]: I0319 12:02:56.198807 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-var-lock\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.198816 master-0 kubenswrapper[17644]: I0319 12:02:56.198818 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config" (OuterVolumeSpecName: "config") pod "76cf2b01-33d9-47eb-be5d-44946c78bf20" (UID: "76cf2b01-33d9-47eb-be5d-44946c78bf20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:02:56.199256 master-0 kubenswrapper[17644]: I0319 12:02:56.198835 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca" (OuterVolumeSpecName: "client-ca") pod "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" (UID: "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:02:56.199256 master-0 kubenswrapper[17644]: I0319 12:02:56.199162 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "76cf2b01-33d9-47eb-be5d-44946c78bf20" (UID: "76cf2b01-33d9-47eb-be5d-44946c78bf20"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:02:56.199363 master-0 kubenswrapper[17644]: I0319 12:02:56.199259 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config" (OuterVolumeSpecName: "config") pod "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" (UID: "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:02:56.201543 master-0 kubenswrapper[17644]: I0319 12:02:56.201352 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76cf2b01-33d9-47eb-be5d-44946c78bf20" (UID: "76cf2b01-33d9-47eb-be5d-44946c78bf20"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:02:56.201543 master-0 kubenswrapper[17644]: I0319 12:02:56.201490 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" (UID: "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:02:56.202065 master-0 kubenswrapper[17644]: I0319 12:02:56.202028 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx" (OuterVolumeSpecName: "kube-api-access-n5skx") pod "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" (UID: "e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85"). InnerVolumeSpecName "kube-api-access-n5skx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:02:56.203948 master-0 kubenswrapper[17644]: I0319 12:02:56.203917 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527" (OuterVolumeSpecName: "kube-api-access-nj527") pod "76cf2b01-33d9-47eb-be5d-44946c78bf20" (UID: "76cf2b01-33d9-47eb-be5d-44946c78bf20"). InnerVolumeSpecName "kube-api-access-nj527". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:02:56.216998 master-0 kubenswrapper[17644]: I0319 12:02:56.216918 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kube-api-access\") pod \"installer-3-retry-1-master-0\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " pod="openshift-kube-scheduler/installer-3-retry-1-master-0"
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300411 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj527\" (UniqueName: \"kubernetes.io/projected/76cf2b01-33d9-47eb-be5d-44946c78bf20-kube-api-access-nj527\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300472 17644 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300499 17644 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76cf2b01-33d9-47eb-be5d-44946c78bf20-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300514 17644 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300526 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5skx\" (UniqueName: \"kubernetes.io/projected/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-kube-api-access-n5skx\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300535 17644 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300546 17644 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300554 17644 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.300580 master-0 kubenswrapper[17644]: I0319 12:02:56.300563 17644 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76cf2b01-33d9-47eb-be5d-44946c78bf20-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 12:02:56.367555 master-0 kubenswrapper[17644]: I0319 12:02:56.367509 17644 generic.go:334] "Generic (PLEG): container finished" podID="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" containerID="f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb" exitCode=0
Mar 19 12:02:56.368264 master-0 kubenswrapper[17644]: I0319 12:02:56.367594 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" event={"ID":"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85","Type":"ContainerDied","Data":"f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb"}
Mar 19 12:02:56.368264 master-0 kubenswrapper[17644]: I0319 12:02:56.367705 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"
event={"ID":"e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85","Type":"ContainerDied","Data":"c0337cf9dcdc7cc749cac3adad0f44d0d5457a466ca84750f37317d1eb4a70f1"} Mar 19 12:02:56.368264 master-0 kubenswrapper[17644]: I0319 12:02:56.367795 17644 scope.go:117] "RemoveContainer" containerID="f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb" Mar 19 12:02:56.368264 master-0 kubenswrapper[17644]: I0319 12:02:56.367891 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd" Mar 19 12:02:56.372593 master-0 kubenswrapper[17644]: I0319 12:02:56.372559 17644 generic.go:334] "Generic (PLEG): container finished" podID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerID="f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b" exitCode=0 Mar 19 12:02:56.372691 master-0 kubenswrapper[17644]: I0319 12:02:56.372608 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" event={"ID":"76cf2b01-33d9-47eb-be5d-44946c78bf20","Type":"ContainerDied","Data":"f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b"} Mar 19 12:02:56.372691 master-0 kubenswrapper[17644]: I0319 12:02:56.372673 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" event={"ID":"76cf2b01-33d9-47eb-be5d-44946c78bf20","Type":"ContainerDied","Data":"eab7f63dc5326173ea1e6327285462aa6a81c9b141ac54e3d2487017aec7ef32"} Mar 19 12:02:56.372691 master-0 kubenswrapper[17644]: I0319 12:02:56.372624 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-548bb99f44-txbjj" Mar 19 12:02:56.384363 master-0 kubenswrapper[17644]: I0319 12:02:56.384180 17644 scope.go:117] "RemoveContainer" containerID="f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb" Mar 19 12:02:56.384777 master-0 kubenswrapper[17644]: E0319 12:02:56.384739 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb\": container with ID starting with f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb not found: ID does not exist" containerID="f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb" Mar 19 12:02:56.384846 master-0 kubenswrapper[17644]: I0319 12:02:56.384785 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb"} err="failed to get container status \"f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb\": rpc error: code = NotFound desc = could not find container \"f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb\": container with ID starting with f95e4fbff564a46ec14c0cd042e81bebe47f9478a757f230a7655159821666eb not found: ID does not exist" Mar 19 12:02:56.384846 master-0 kubenswrapper[17644]: I0319 12:02:56.384813 17644 scope.go:117] "RemoveContainer" containerID="f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b" Mar 19 12:02:56.395292 master-0 kubenswrapper[17644]: I0319 12:02:56.395247 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-retry-1-master-0" Mar 19 12:02:56.408070 master-0 kubenswrapper[17644]: I0319 12:02:56.407982 17644 scope.go:117] "RemoveContainer" containerID="87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5" Mar 19 12:02:56.409837 master-0 kubenswrapper[17644]: I0319 12:02:56.409446 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"] Mar 19 12:02:56.414217 master-0 kubenswrapper[17644]: I0319 12:02:56.414177 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864f875b6b-rcjvd"] Mar 19 12:02:56.426009 master-0 kubenswrapper[17644]: I0319 12:02:56.425957 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-548bb99f44-txbjj"] Mar 19 12:02:56.427345 master-0 kubenswrapper[17644]: I0319 12:02:56.427320 17644 scope.go:117] "RemoveContainer" containerID="f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b" Mar 19 12:02:56.428061 master-0 kubenswrapper[17644]: E0319 12:02:56.428032 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b\": container with ID starting with f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b not found: ID does not exist" containerID="f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b" Mar 19 12:02:56.428149 master-0 kubenswrapper[17644]: I0319 12:02:56.428076 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b"} err="failed to get container status \"f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b\": rpc error: code = NotFound desc = could not find container 
\"f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b\": container with ID starting with f86e998e3bf1f66c4662c3a819ee87d4b1804f14a9851d9972eed9aee129f60b not found: ID does not exist" Mar 19 12:02:56.428149 master-0 kubenswrapper[17644]: I0319 12:02:56.428106 17644 scope.go:117] "RemoveContainer" containerID="87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5" Mar 19 12:02:56.428574 master-0 kubenswrapper[17644]: E0319 12:02:56.428549 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5\": container with ID starting with 87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5 not found: ID does not exist" containerID="87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5" Mar 19 12:02:56.428622 master-0 kubenswrapper[17644]: I0319 12:02:56.428575 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5"} err="failed to get container status \"87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5\": rpc error: code = NotFound desc = could not find container \"87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5\": container with ID starting with 87cbd1b5cfb2e78754584648c786a0ccf511cf3452d3bed2f55e931cc6e6e1b5 not found: ID does not exist" Mar 19 12:02:56.431755 master-0 kubenswrapper[17644]: I0319 12:02:56.431638 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-548bb99f44-txbjj"] Mar 19 12:02:56.491078 master-0 kubenswrapper[17644]: I0319 12:02:56.491017 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" path="/var/lib/kubelet/pods/76cf2b01-33d9-47eb-be5d-44946c78bf20/volumes" Mar 19 12:02:56.491675 master-0 kubenswrapper[17644]: 
I0319 12:02:56.491648 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" path="/var/lib/kubelet/pods/e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85/volumes" Mar 19 12:02:56.845815 master-0 kubenswrapper[17644]: I0319 12:02:56.845698 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-retry-1-master-0"] Mar 19 12:02:56.847710 master-0 kubenswrapper[17644]: W0319 12:02:56.847643 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8980acc0_b2b5_4e44_9b8e_f7086f5a46bb.slice/crio-baa93e1ac5ad966d619d2678bfcbe0c4d3217e4fc5d349b4bbb3bf791724cdf5 WatchSource:0}: Error finding container baa93e1ac5ad966d619d2678bfcbe0c4d3217e4fc5d349b4bbb3bf791724cdf5: Status 404 returned error can't find the container with id baa93e1ac5ad966d619d2678bfcbe0c4d3217e4fc5d349b4bbb3bf791724cdf5 Mar 19 12:02:57.131777 master-0 kubenswrapper[17644]: I0319 12:02:57.131697 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b485796d4-dqrfs"] Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: E0319 12:02:57.132127 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: I0319 12:02:57.132142 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: E0319 12:02:57.132156 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" containerName="route-controller-manager" Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: I0319 12:02:57.132162 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" containerName="route-controller-manager" Mar 
19 12:02:57.136148 master-0 kubenswrapper[17644]: E0319 12:02:57.132171 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: I0319 12:02:57.132177 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: I0319 12:02:57.132296 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c0bb87-0d65-4d7c-9ddd-a4889f0ebb85" containerName="route-controller-manager" Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: I0319 12:02:57.132338 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: I0319 12:02:57.132346 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="76cf2b01-33d9-47eb-be5d-44946c78bf20" containerName="controller-manager" Mar 19 12:02:57.136148 master-0 kubenswrapper[17644]: I0319 12:02:57.132890 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.140330 master-0 kubenswrapper[17644]: I0319 12:02:57.138295 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 12:02:57.140330 master-0 kubenswrapper[17644]: I0319 12:02:57.138561 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 12:02:57.140330 master-0 kubenswrapper[17644]: I0319 12:02:57.138692 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 12:02:57.140330 master-0 kubenswrapper[17644]: I0319 12:02:57.138848 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-mv7dz" Mar 19 12:02:57.140330 master-0 kubenswrapper[17644]: I0319 12:02:57.138977 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 12:02:57.140330 master-0 kubenswrapper[17644]: I0319 12:02:57.140128 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 12:02:57.140330 master-0 kubenswrapper[17644]: I0319 12:02:57.140283 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg"] Mar 19 12:02:57.142401 master-0 kubenswrapper[17644]: I0319 12:02:57.142346 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.144658 master-0 kubenswrapper[17644]: I0319 12:02:57.144609 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 12:02:57.145267 master-0 kubenswrapper[17644]: I0319 12:02:57.145210 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 12:02:57.145344 master-0 kubenswrapper[17644]: I0319 12:02:57.145239 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 12:02:57.145455 master-0 kubenswrapper[17644]: I0319 12:02:57.145431 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 12:02:57.145607 master-0 kubenswrapper[17644]: I0319 12:02:57.145586 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-qsmbf" Mar 19 12:02:57.145700 master-0 kubenswrapper[17644]: I0319 12:02:57.145680 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 12:02:57.145927 master-0 kubenswrapper[17644]: I0319 12:02:57.145676 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 12:02:57.152382 master-0 kubenswrapper[17644]: I0319 12:02:57.152332 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b485796d4-dqrfs"] Mar 19 12:02:57.163560 master-0 kubenswrapper[17644]: I0319 12:02:57.162751 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg"] Mar 19 12:02:57.212936 master-0 
kubenswrapper[17644]: I0319 12:02:57.212886 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-client-ca\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.213162 master-0 kubenswrapper[17644]: I0319 12:02:57.212943 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnrg4\" (UniqueName: \"kubernetes.io/projected/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-kube-api-access-bnrg4\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.213162 master-0 kubenswrapper[17644]: I0319 12:02:57.212971 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-serving-cert\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.213162 master-0 kubenswrapper[17644]: I0319 12:02:57.213025 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-serving-cert\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.213162 master-0 kubenswrapper[17644]: I0319 12:02:57.213104 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-proxy-ca-bundles\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.213162 master-0 kubenswrapper[17644]: I0319 12:02:57.213144 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t5hm\" (UniqueName: \"kubernetes.io/projected/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-kube-api-access-5t5hm\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.213326 master-0 kubenswrapper[17644]: I0319 12:02:57.213174 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-config\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.213326 master-0 kubenswrapper[17644]: I0319 12:02:57.213196 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-config\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.213326 master-0 kubenswrapper[17644]: I0319 12:02:57.213250 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-client-ca\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: 
\"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.314986 master-0 kubenswrapper[17644]: I0319 12:02:57.314922 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-serving-cert\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.314986 master-0 kubenswrapper[17644]: I0319 12:02:57.315000 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-proxy-ca-bundles\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.315286 master-0 kubenswrapper[17644]: I0319 12:02:57.315251 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t5hm\" (UniqueName: \"kubernetes.io/projected/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-kube-api-access-5t5hm\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.315392 master-0 kubenswrapper[17644]: I0319 12:02:57.315371 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-config\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.315459 master-0 kubenswrapper[17644]: I0319 12:02:57.315399 17644 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-config\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.315459 master-0 kubenswrapper[17644]: I0319 12:02:57.315453 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-client-ca\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.315522 master-0 kubenswrapper[17644]: I0319 12:02:57.315514 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnrg4\" (UniqueName: \"kubernetes.io/projected/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-kube-api-access-bnrg4\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.315551 master-0 kubenswrapper[17644]: I0319 12:02:57.315531 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-client-ca\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.315583 master-0 kubenswrapper[17644]: I0319 12:02:57.315563 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-serving-cert\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: 
\"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.317185 master-0 kubenswrapper[17644]: I0319 12:02:57.316389 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-proxy-ca-bundles\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.317185 master-0 kubenswrapper[17644]: I0319 12:02:57.316417 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-client-ca\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.317427 master-0 kubenswrapper[17644]: I0319 12:02:57.317246 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-client-ca\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.317427 master-0 kubenswrapper[17644]: I0319 12:02:57.317269 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-config\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.318389 master-0 kubenswrapper[17644]: I0319 12:02:57.318358 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-config\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.318389 master-0 kubenswrapper[17644]: I0319 12:02:57.318377 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-serving-cert\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.327796 master-0 kubenswrapper[17644]: I0319 12:02:57.327768 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-serving-cert\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.330757 master-0 kubenswrapper[17644]: I0319 12:02:57.330689 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t5hm\" (UniqueName: \"kubernetes.io/projected/97f5b7e8-eee9-42b1-a23e-8a74f1ce4585-kube-api-access-5t5hm\") pod \"controller-manager-b485796d4-dqrfs\" (UID: \"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585\") " pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.330945 master-0 kubenswrapper[17644]: I0319 12:02:57.330917 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnrg4\" (UniqueName: \"kubernetes.io/projected/fae580a3-42d7-4ee9-ac9c-b747e350f8f8-kube-api-access-bnrg4\") pod \"route-controller-manager-5b7b59c69f-ltvsg\" (UID: \"fae580a3-42d7-4ee9-ac9c-b747e350f8f8\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:57.381222 master-0 kubenswrapper[17644]: I0319 12:02:57.381108 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-retry-1-master-0" event={"ID":"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb","Type":"ContainerStarted","Data":"5a04fcbf5bd0dc08054acc532f8682de34a7af91b9add42d602003c290fead83"} Mar 19 12:02:57.381437 master-0 kubenswrapper[17644]: I0319 12:02:57.381423 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-retry-1-master-0" event={"ID":"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb","Type":"ContainerStarted","Data":"baa93e1ac5ad966d619d2678bfcbe0c4d3217e4fc5d349b4bbb3bf791724cdf5"} Mar 19 12:02:57.480473 master-0 kubenswrapper[17644]: I0319 12:02:57.480407 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:57.496593 master-0 kubenswrapper[17644]: I0319 12:02:57.496534 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:58.004213 master-0 kubenswrapper[17644]: I0319 12:02:58.003672 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-retry-1-master-0" podStartSLOduration=2.003629633 podStartE2EDuration="2.003629633s" podCreationTimestamp="2026-03-19 12:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:02:57.399333987 +0000 UTC m=+211.169292062" watchObservedRunningTime="2026-03-19 12:02:58.003629633 +0000 UTC m=+211.773587688" Mar 19 12:02:58.012815 master-0 kubenswrapper[17644]: W0319 12:02:58.008714 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f5b7e8_eee9_42b1_a23e_8a74f1ce4585.slice/crio-fb2dbe67e769d4b1055ec824a7a9ee5d34fa3da44409b22211ba4bf94a5a70ca WatchSource:0}: Error finding container fb2dbe67e769d4b1055ec824a7a9ee5d34fa3da44409b22211ba4bf94a5a70ca: Status 404 returned error can't find the container with id fb2dbe67e769d4b1055ec824a7a9ee5d34fa3da44409b22211ba4bf94a5a70ca Mar 19 12:02:58.015318 master-0 kubenswrapper[17644]: I0319 12:02:58.015271 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b485796d4-dqrfs"] Mar 19 12:02:58.061371 master-0 kubenswrapper[17644]: I0319 12:02:58.058869 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg"] Mar 19 12:02:58.391166 master-0 kubenswrapper[17644]: I0319 12:02:58.390905 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" 
event={"ID":"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585","Type":"ContainerStarted","Data":"6ed56431e7e3a29594e8c55d24af97e05dc53fc52776fe94fedb9d579e864bcd"} Mar 19 12:02:58.391166 master-0 kubenswrapper[17644]: I0319 12:02:58.390966 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" event={"ID":"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585","Type":"ContainerStarted","Data":"fb2dbe67e769d4b1055ec824a7a9ee5d34fa3da44409b22211ba4bf94a5a70ca"} Mar 19 12:02:58.392049 master-0 kubenswrapper[17644]: I0319 12:02:58.392007 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" event={"ID":"fae580a3-42d7-4ee9-ac9c-b747e350f8f8","Type":"ContainerStarted","Data":"a72c1dcb7706b9dd9a9a555cbabbd5e6a1be4a7dc9e2216ad1e6aa64c026ba99"} Mar 19 12:02:59.406258 master-0 kubenswrapper[17644]: I0319 12:02:59.405970 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" event={"ID":"fae580a3-42d7-4ee9-ac9c-b747e350f8f8","Type":"ContainerStarted","Data":"26f8560b0493948596d0a0a025d19625d4f8effa30d70c8d5d003276b8a73778"} Mar 19 12:02:59.406870 master-0 kubenswrapper[17644]: I0319 12:02:59.406277 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:59.406870 master-0 kubenswrapper[17644]: I0319 12:02:59.406495 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:59.411825 master-0 kubenswrapper[17644]: I0319 12:02:59.411772 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:02:59.415134 master-0 kubenswrapper[17644]: I0319 12:02:59.415085 17644 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" Mar 19 12:02:59.429151 master-0 kubenswrapper[17644]: I0319 12:02:59.428281 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" podStartSLOduration=4.428260158 podStartE2EDuration="4.428260158s" podCreationTimestamp="2026-03-19 12:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:02:59.422899315 +0000 UTC m=+213.192857360" watchObservedRunningTime="2026-03-19 12:02:59.428260158 +0000 UTC m=+213.198218193" Mar 19 12:02:59.466993 master-0 kubenswrapper[17644]: I0319 12:02:59.466864 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b7b59c69f-ltvsg" podStartSLOduration=4.466839235 podStartE2EDuration="4.466839235s" podCreationTimestamp="2026-03-19 12:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:02:59.461698867 +0000 UTC m=+213.231656922" watchObservedRunningTime="2026-03-19 12:02:59.466839235 +0000 UTC m=+213.236797270" Mar 19 12:03:06.494949 master-0 kubenswrapper[17644]: I0319 12:03:06.494823 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ksvww"] Mar 19 12:03:06.495644 master-0 kubenswrapper[17644]: I0319 12:03:06.495619 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.498181 master-0 kubenswrapper[17644]: I0319 12:03:06.498147 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-b27xz" Mar 19 12:03:06.499058 master-0 kubenswrapper[17644]: I0319 12:03:06.498996 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 12:03:06.640191 master-0 kubenswrapper[17644]: I0319 12:03:06.640123 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-host\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.640191 master-0 kubenswrapper[17644]: I0319 12:03:06.640204 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtslr\" (UniqueName: \"kubernetes.io/projected/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-kube-api-access-gtslr\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.640527 master-0 kubenswrapper[17644]: I0319 12:03:06.640279 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-serviceca\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.741743 master-0 kubenswrapper[17644]: I0319 12:03:06.741631 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-host\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " 
pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.741967 master-0 kubenswrapper[17644]: I0319 12:03:06.741753 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtslr\" (UniqueName: \"kubernetes.io/projected/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-kube-api-access-gtslr\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.741967 master-0 kubenswrapper[17644]: I0319 12:03:06.741766 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-host\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.741967 master-0 kubenswrapper[17644]: I0319 12:03:06.741813 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-serviceca\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.742515 master-0 kubenswrapper[17644]: I0319 12:03:06.742419 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-serviceca\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.756229 master-0 kubenswrapper[17644]: I0319 12:03:06.756115 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtslr\" (UniqueName: \"kubernetes.io/projected/cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851-kube-api-access-gtslr\") pod \"node-ca-ksvww\" (UID: \"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851\") " pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.814539 master-0 
kubenswrapper[17644]: I0319 12:03:06.814458 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ksvww" Mar 19 12:03:06.831237 master-0 kubenswrapper[17644]: W0319 12:03:06.831151 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf8f5cd4_3f6a_43f6_bdbe_9fc79f015851.slice/crio-f70d405073e5ec9349d36053fdff40a006831201bda35ed20ddcde919e624550 WatchSource:0}: Error finding container f70d405073e5ec9349d36053fdff40a006831201bda35ed20ddcde919e624550: Status 404 returned error can't find the container with id f70d405073e5ec9349d36053fdff40a006831201bda35ed20ddcde919e624550 Mar 19 12:03:07.454644 master-0 kubenswrapper[17644]: I0319 12:03:07.454577 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ksvww" event={"ID":"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851","Type":"ContainerStarted","Data":"f70d405073e5ec9349d36053fdff40a006831201bda35ed20ddcde919e624550"} Mar 19 12:03:09.470547 master-0 kubenswrapper[17644]: I0319 12:03:09.470472 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ksvww" event={"ID":"cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851","Type":"ContainerStarted","Data":"7636a113f96bd102189753fea2be89bb699a0f9028168c7333fc526dff8f1b13"} Mar 19 12:03:09.489135 master-0 kubenswrapper[17644]: I0319 12:03:09.489037 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ksvww" podStartSLOduration=1.469900609 podStartE2EDuration="3.489017257s" podCreationTimestamp="2026-03-19 12:03:06 +0000 UTC" firstStartedPulling="2026-03-19 12:03:06.832747239 +0000 UTC m=+220.602705284" lastFinishedPulling="2026-03-19 12:03:08.851863897 +0000 UTC m=+222.621821932" observedRunningTime="2026-03-19 12:03:09.485482459 +0000 UTC m=+223.255440514" watchObservedRunningTime="2026-03-19 12:03:09.489017257 +0000 UTC 
m=+223.258975302" Mar 19 12:03:28.154220 master-0 kubenswrapper[17644]: I0319 12:03:28.154138 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 12:03:28.154810 master-0 kubenswrapper[17644]: I0319 12:03:28.154376 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" containerID="cri-o://ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6" gracePeriod=30 Mar 19 12:03:28.155744 master-0 kubenswrapper[17644]: I0319 12:03:28.155667 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 12:03:28.156291 master-0 kubenswrapper[17644]: E0319 12:03:28.156255 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.156291 master-0 kubenswrapper[17644]: I0319 12:03:28.156280 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.156377 master-0 kubenswrapper[17644]: E0319 12:03:28.156299 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.156377 master-0 kubenswrapper[17644]: I0319 12:03:28.156308 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.156527 master-0 kubenswrapper[17644]: I0319 12:03:28.156501 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.156579 master-0 kubenswrapper[17644]: I0319 12:03:28.156546 17644 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.156713 master-0 kubenswrapper[17644]: E0319 12:03:28.156682 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.156713 master-0 kubenswrapper[17644]: I0319 12:03:28.156701 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.156933 master-0 kubenswrapper[17644]: I0319 12:03:28.156903 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 12:03:28.158100 master-0 kubenswrapper[17644]: I0319 12:03:28.158066 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:28.180317 master-0 kubenswrapper[17644]: I0319 12:03:28.180283 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:28.180425 master-0 kubenswrapper[17644]: I0319 12:03:28.180362 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:28.196292 master-0 kubenswrapper[17644]: I0319 12:03:28.196213 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 12:03:28.281037 
master-0 kubenswrapper[17644]: I0319 12:03:28.281007 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:28.281190 master-0 kubenswrapper[17644]: I0319 12:03:28.281154 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:28.281701 master-0 kubenswrapper[17644]: I0319 12:03:28.281670 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:28.281954 master-0 kubenswrapper[17644]: I0319 12:03:28.281918 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:28.322448 master-0 kubenswrapper[17644]: I0319 12:03:28.322382 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 12:03:28.344769 master-0 kubenswrapper[17644]: I0319 12:03:28.341751 17644 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="0bd9f597-2384-49a3-a928-15186ab72145" Mar 19 12:03:28.406331 master-0 kubenswrapper[17644]: E0319 12:03:28.406212 17644 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod8980acc0_b2b5_4e44_9b8e_f7086f5a46bb.slice/crio-5a04fcbf5bd0dc08054acc532f8682de34a7af91b9add42d602003c290fead83.scope\": RecentStats: unable to find data in memory cache]" Mar 19 12:03:28.483232 master-0 kubenswrapper[17644]: I0319 12:03:28.483176 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " Mar 19 12:03:28.483494 master-0 kubenswrapper[17644]: I0319 12:03:28.483337 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " Mar 19 12:03:28.483494 master-0 kubenswrapper[17644]: I0319 12:03:28.483366 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets" (OuterVolumeSpecName: "secrets") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:28.483494 master-0 kubenswrapper[17644]: I0319 12:03:28.483486 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs" (OuterVolumeSpecName: "logs") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:28.483764 master-0 kubenswrapper[17644]: I0319 12:03:28.483714 17644 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:28.483932 master-0 kubenswrapper[17644]: I0319 12:03:28.483908 17644 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:28.492434 master-0 kubenswrapper[17644]: I0319 12:03:28.492381 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:28.494096 master-0 kubenswrapper[17644]: I0319 12:03:28.493291 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83737980b9ee109184b1d78e942cf36" path="/var/lib/kubelet/pods/c83737980b9ee109184b1d78e942cf36/volumes" Mar 19 12:03:28.494096 master-0 kubenswrapper[17644]: I0319 12:03:28.493566 17644 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Mar 19 12:03:28.507288 master-0 kubenswrapper[17644]: I0319 12:03:28.507233 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 12:03:28.507518 master-0 kubenswrapper[17644]: I0319 12:03:28.507302 17644 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="0bd9f597-2384-49a3-a928-15186ab72145" Mar 19 12:03:28.516208 master-0 kubenswrapper[17644]: I0319 12:03:28.514914 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 12:03:28.516208 master-0 kubenswrapper[17644]: I0319 12:03:28.514982 17644 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="0bd9f597-2384-49a3-a928-15186ab72145" Mar 19 12:03:28.520158 master-0 kubenswrapper[17644]: W0319 12:03:28.520116 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd3d3608fe9c86b0f65904ec2353df4.slice/crio-7d88bfca2ccb92d5242d0c98389226aa2b5da433c6f4854d5f56212b76a2f857 WatchSource:0}: Error finding container 7d88bfca2ccb92d5242d0c98389226aa2b5da433c6f4854d5f56212b76a2f857: Status 404 returned error can't find the container with id 7d88bfca2ccb92d5242d0c98389226aa2b5da433c6f4854d5f56212b76a2f857 Mar 19 12:03:28.615600 master-0 
kubenswrapper[17644]: I0319 12:03:28.615541 17644 generic.go:334] "Generic (PLEG): container finished" podID="8980acc0-b2b5-4e44-9b8e-f7086f5a46bb" containerID="5a04fcbf5bd0dc08054acc532f8682de34a7af91b9add42d602003c290fead83" exitCode=0 Mar 19 12:03:28.615873 master-0 kubenswrapper[17644]: I0319 12:03:28.615604 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-retry-1-master-0" event={"ID":"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb","Type":"ContainerDied","Data":"5a04fcbf5bd0dc08054acc532f8682de34a7af91b9add42d602003c290fead83"} Mar 19 12:03:28.618253 master-0 kubenswrapper[17644]: I0319 12:03:28.618224 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"7d88bfca2ccb92d5242d0c98389226aa2b5da433c6f4854d5f56212b76a2f857"} Mar 19 12:03:28.623079 master-0 kubenswrapper[17644]: I0319 12:03:28.620241 17644 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6" exitCode=0 Mar 19 12:03:28.623079 master-0 kubenswrapper[17644]: I0319 12:03:28.620276 17644 scope.go:117] "RemoveContainer" containerID="ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6" Mar 19 12:03:28.623079 master-0 kubenswrapper[17644]: I0319 12:03:28.620349 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 12:03:28.642706 master-0 kubenswrapper[17644]: I0319 12:03:28.642678 17644 scope.go:117] "RemoveContainer" containerID="a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825" Mar 19 12:03:28.662031 master-0 kubenswrapper[17644]: I0319 12:03:28.661949 17644 scope.go:117] "RemoveContainer" containerID="ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6" Mar 19 12:03:28.663638 master-0 kubenswrapper[17644]: E0319 12:03:28.663591 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6\": container with ID starting with ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6 not found: ID does not exist" containerID="ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6" Mar 19 12:03:28.663711 master-0 kubenswrapper[17644]: I0319 12:03:28.663669 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6"} err="failed to get container status \"ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6\": rpc error: code = NotFound desc = could not find container \"ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6\": container with ID starting with ed2f9594479b46ec9f3ffcb6affe24cb0f9c8f73bdb2153419d01fd69d9d7cd6 not found: ID does not exist" Mar 19 12:03:28.663797 master-0 kubenswrapper[17644]: I0319 12:03:28.663699 17644 scope.go:117] "RemoveContainer" containerID="a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825" Mar 19 12:03:28.664343 master-0 kubenswrapper[17644]: E0319 12:03:28.664293 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825\": container with ID starting with a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825 not found: ID does not exist" containerID="a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825" Mar 19 12:03:28.664402 master-0 kubenswrapper[17644]: I0319 12:03:28.664345 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825"} err="failed to get container status \"a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825\": rpc error: code = NotFound desc = could not find container \"a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825\": container with ID starting with a24c957c2955f33fcac616e1dace18be5248f20b6e9d2c791c70c17f3df96825 not found: ID does not exist" Mar 19 12:03:29.631074 master-0 kubenswrapper[17644]: I0319 12:03:29.631006 17644 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c" exitCode=0 Mar 19 12:03:29.631806 master-0 kubenswrapper[17644]: I0319 12:03:29.631065 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerDied","Data":"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c"} Mar 19 12:03:30.160007 master-0 kubenswrapper[17644]: I0319 12:03:30.159946 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-retry-1-master-0" Mar 19 12:03:30.310682 master-0 kubenswrapper[17644]: I0319 12:03:30.310627 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kubelet-dir\") pod \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " Mar 19 12:03:30.310931 master-0 kubenswrapper[17644]: I0319 12:03:30.310870 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kube-api-access\") pod \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " Mar 19 12:03:30.310931 master-0 kubenswrapper[17644]: I0319 12:03:30.310897 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-var-lock\") pod \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\" (UID: \"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb\") " Mar 19 12:03:30.311095 master-0 kubenswrapper[17644]: I0319 12:03:30.311044 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-var-lock" (OuterVolumeSpecName: "var-lock") pod "8980acc0-b2b5-4e44-9b8e-f7086f5a46bb" (UID: "8980acc0-b2b5-4e44-9b8e-f7086f5a46bb"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:30.311363 master-0 kubenswrapper[17644]: I0319 12:03:30.311335 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:30.311437 master-0 kubenswrapper[17644]: I0319 12:03:30.311378 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8980acc0-b2b5-4e44-9b8e-f7086f5a46bb" (UID: "8980acc0-b2b5-4e44-9b8e-f7086f5a46bb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:30.323081 master-0 kubenswrapper[17644]: I0319 12:03:30.323021 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8980acc0-b2b5-4e44-9b8e-f7086f5a46bb" (UID: "8980acc0-b2b5-4e44-9b8e-f7086f5a46bb"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:03:30.412413 master-0 kubenswrapper[17644]: I0319 12:03:30.412280 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:30.412413 master-0 kubenswrapper[17644]: I0319 12:03:30.412317 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8980acc0-b2b5-4e44-9b8e-f7086f5a46bb-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:30.641855 master-0 kubenswrapper[17644]: I0319 12:03:30.641799 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-retry-1-master-0" event={"ID":"8980acc0-b2b5-4e44-9b8e-f7086f5a46bb","Type":"ContainerDied","Data":"baa93e1ac5ad966d619d2678bfcbe0c4d3217e4fc5d349b4bbb3bf791724cdf5"} Mar 19 12:03:30.641855 master-0 kubenswrapper[17644]: I0319 12:03:30.641851 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa93e1ac5ad966d619d2678bfcbe0c4d3217e4fc5d349b4bbb3bf791724cdf5" Mar 19 12:03:30.642468 master-0 kubenswrapper[17644]: I0319 12:03:30.641915 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-retry-1-master-0" Mar 19 12:03:30.645952 master-0 kubenswrapper[17644]: I0319 12:03:30.645925 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464"} Mar 19 12:03:30.645952 master-0 kubenswrapper[17644]: I0319 12:03:30.645954 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32"} Mar 19 12:03:30.646070 master-0 kubenswrapper[17644]: I0319 12:03:30.645967 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf"} Mar 19 12:03:30.646597 master-0 kubenswrapper[17644]: I0319 12:03:30.646144 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:30.672263 master-0 kubenswrapper[17644]: I0319 12:03:30.672024 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.671993529 podStartE2EDuration="2.671993529s" podCreationTimestamp="2026-03-19 12:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:03:30.67043822 +0000 UTC m=+244.440396265" watchObservedRunningTime="2026-03-19 12:03:30.671993529 +0000 UTC m=+244.441951564" Mar 19 12:03:39.228295 master-0 kubenswrapper[17644]: I0319 
12:03:39.227095 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:03:39.228295 master-0 kubenswrapper[17644]: E0319 12:03:39.227448 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8980acc0-b2b5-4e44-9b8e-f7086f5a46bb" containerName="installer" Mar 19 12:03:39.228295 master-0 kubenswrapper[17644]: I0319 12:03:39.227464 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8980acc0-b2b5-4e44-9b8e-f7086f5a46bb" containerName="installer" Mar 19 12:03:39.228295 master-0 kubenswrapper[17644]: I0319 12:03:39.227628 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8980acc0-b2b5-4e44-9b8e-f7086f5a46bb" containerName="installer" Mar 19 12:03:39.233580 master-0 kubenswrapper[17644]: I0319 12:03:39.233000 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.238072 master-0 kubenswrapper[17644]: I0319 12:03:39.238024 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 12:03:39.238072 master-0 kubenswrapper[17644]: I0319 12:03:39.238055 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 12:03:39.238361 master-0 kubenswrapper[17644]: I0319 12:03:39.238142 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 19 12:03:39.238361 master-0 kubenswrapper[17644]: I0319 12:03:39.238261 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 12:03:39.238361 master-0 kubenswrapper[17644]: I0319 12:03:39.238279 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 12:03:39.238361 master-0 kubenswrapper[17644]: I0319 
12:03:39.238312 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 12:03:39.238492 master-0 kubenswrapper[17644]: I0319 12:03:39.238450 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 12:03:39.238492 master-0 kubenswrapper[17644]: I0319 12:03:39.238469 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 12:03:39.250137 master-0 kubenswrapper[17644]: I0319 12:03:39.249992 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:03:39.289779 master-0 kubenswrapper[17644]: I0319 12:03:39.289689 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-volume\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.289779 master-0 kubenswrapper[17644]: I0319 12:03:39.289776 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.289841 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.289894 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.289923 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.289949 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.289996 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.290027 17644 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-out\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.290053 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.290105 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d4bm\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-kube-api-access-2d4bm\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.290138 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-web-config\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.290470 master-0 kubenswrapper[17644]: I0319 12:03:39.290164 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 
12:03:39.393489 master-0 kubenswrapper[17644]: I0319 12:03:39.393411 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393489 master-0 kubenswrapper[17644]: I0319 12:03:39.393481 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393832 master-0 kubenswrapper[17644]: I0319 12:03:39.393511 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393832 master-0 kubenswrapper[17644]: I0319 12:03:39.393555 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393832 master-0 kubenswrapper[17644]: I0319 12:03:39.393576 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-out\") pod \"alertmanager-main-0\" (UID: 
\"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393832 master-0 kubenswrapper[17644]: I0319 12:03:39.393596 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393832 master-0 kubenswrapper[17644]: I0319 12:03:39.393618 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d4bm\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-kube-api-access-2d4bm\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393832 master-0 kubenswrapper[17644]: I0319 12:03:39.393645 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-web-config\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393832 master-0 kubenswrapper[17644]: I0319 12:03:39.393670 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.393832 master-0 kubenswrapper[17644]: I0319 12:03:39.393784 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.394052 master-0 kubenswrapper[17644]: I0319 12:03:39.393852 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.394052 master-0 kubenswrapper[17644]: I0319 12:03:39.393917 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.394113 master-0 kubenswrapper[17644]: E0319 12:03:39.394093 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle podName:8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8 nodeName:}" failed. No retries permitted until 2026-03-19 12:03:39.8940735 +0000 UTC m=+253.664031535 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:39.397392 master-0 kubenswrapper[17644]: I0319 12:03:39.397324 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.398097 master-0 kubenswrapper[17644]: I0319 12:03:39.398028 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.399464 master-0 kubenswrapper[17644]: I0319 12:03:39.399415 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.399606 master-0 kubenswrapper[17644]: I0319 12:03:39.399562 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-web-config\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.399928 master-0 kubenswrapper[17644]: I0319 12:03:39.399881 17644 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-volume\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.400142 master-0 kubenswrapper[17644]: I0319 12:03:39.400106 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.401089 master-0 kubenswrapper[17644]: I0319 12:03:39.401053 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-out\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.403637 master-0 kubenswrapper[17644]: I0319 12:03:39.403584 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.403858 master-0 kubenswrapper[17644]: I0319 12:03:39.403797 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.404362 
master-0 kubenswrapper[17644]: I0319 12:03:39.404318 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.410625 master-0 kubenswrapper[17644]: I0319 12:03:39.410575 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d4bm\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-kube-api-access-2d4bm\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.900531 master-0 kubenswrapper[17644]: I0319 12:03:39.900427 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:39.900986 master-0 kubenswrapper[17644]: E0319 12:03:39.900659 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle podName:8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8 nodeName:}" failed. No retries permitted until 2026-03-19 12:03:40.900639504 +0000 UTC m=+254.670597539 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:40.183774 master-0 kubenswrapper[17644]: I0319 12:03:40.183689 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-747db875db-zc5nj"] Mar 19 12:03:40.185814 master-0 kubenswrapper[17644]: I0319 12:03:40.185775 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.188342 master-0 kubenswrapper[17644]: I0319 12:03:40.188299 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 12:03:40.188572 master-0 kubenswrapper[17644]: I0319 12:03:40.188546 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-bqm2750ms6tma" Mar 19 12:03:40.188815 master-0 kubenswrapper[17644]: I0319 12:03:40.188720 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 19 12:03:40.188948 master-0 kubenswrapper[17644]: I0319 12:03:40.188929 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 19 12:03:40.191486 master-0 kubenswrapper[17644]: I0319 12:03:40.191450 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 12:03:40.191486 master-0 kubenswrapper[17644]: I0319 12:03:40.191472 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 12:03:40.200215 master-0 kubenswrapper[17644]: I0319 
12:03:40.200164 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-747db875db-zc5nj"] Mar 19 12:03:40.305549 master-0 kubenswrapper[17644]: I0319 12:03:40.305471 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.305549 master-0 kubenswrapper[17644]: I0319 12:03:40.305552 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.306159 master-0 kubenswrapper[17644]: I0319 12:03:40.305629 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.306159 master-0 kubenswrapper[17644]: I0319 12:03:40.305685 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zhd\" (UniqueName: \"kubernetes.io/projected/da6da885-6a82-47bd-a90f-ce81d8e78929-kube-api-access-k7zhd\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " 
pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.306159 master-0 kubenswrapper[17644]: I0319 12:03:40.305717 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/da6da885-6a82-47bd-a90f-ce81d8e78929-metrics-client-ca\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.306159 master-0 kubenswrapper[17644]: I0319 12:03:40.305777 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-grpc-tls\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.306159 master-0 kubenswrapper[17644]: I0319 12:03:40.305822 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.306159 master-0 kubenswrapper[17644]: I0319 12:03:40.305842 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-tls\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.406855 master-0 kubenswrapper[17644]: I0319 12:03:40.406804 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-grpc-tls\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.407240 master-0 kubenswrapper[17644]: I0319 12:03:40.407201 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.407670 master-0 kubenswrapper[17644]: I0319 12:03:40.407595 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-tls\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.407769 master-0 kubenswrapper[17644]: I0319 12:03:40.407708 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.407822 master-0 kubenswrapper[17644]: I0319 12:03:40.407792 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.407936 master-0 kubenswrapper[17644]: I0319 12:03:40.407914 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.408016 master-0 kubenswrapper[17644]: I0319 12:03:40.407992 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7zhd\" (UniqueName: \"kubernetes.io/projected/da6da885-6a82-47bd-a90f-ce81d8e78929-kube-api-access-k7zhd\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.408095 master-0 kubenswrapper[17644]: I0319 12:03:40.408027 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/da6da885-6a82-47bd-a90f-ce81d8e78929-metrics-client-ca\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.409127 master-0 kubenswrapper[17644]: I0319 12:03:40.409094 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/da6da885-6a82-47bd-a90f-ce81d8e78929-metrics-client-ca\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" 
Mar 19 12:03:40.413995 master-0 kubenswrapper[17644]: I0319 12:03:40.413953 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.414177 master-0 kubenswrapper[17644]: I0319 12:03:40.414149 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-tls\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.414318 master-0 kubenswrapper[17644]: I0319 12:03:40.414276 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.415191 master-0 kubenswrapper[17644]: I0319 12:03:40.415144 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.417175 master-0 kubenswrapper[17644]: I0319 12:03:40.416953 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.421419 master-0 kubenswrapper[17644]: I0319 12:03:40.421381 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/da6da885-6a82-47bd-a90f-ce81d8e78929-secret-grpc-tls\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.424967 master-0 kubenswrapper[17644]: I0319 12:03:40.424909 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7zhd\" (UniqueName: \"kubernetes.io/projected/da6da885-6a82-47bd-a90f-ce81d8e78929-kube-api-access-k7zhd\") pod \"thanos-querier-747db875db-zc5nj\" (UID: \"da6da885-6a82-47bd-a90f-ce81d8e78929\") " pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.505979 master-0 kubenswrapper[17644]: I0319 12:03:40.505871 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:40.918164 master-0 kubenswrapper[17644]: I0319 12:03:40.918044 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:40.918366 master-0 kubenswrapper[17644]: E0319 12:03:40.918228 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle podName:8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8 nodeName:}" failed. No retries permitted until 2026-03-19 12:03:42.918206072 +0000 UTC m=+256.688164107 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:40.996831 master-0 kubenswrapper[17644]: I0319 12:03:40.995825 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-747db875db-zc5nj"] Mar 19 12:03:41.735949 master-0 kubenswrapper[17644]: I0319 12:03:41.735849 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" event={"ID":"da6da885-6a82-47bd-a90f-ce81d8e78929","Type":"ContainerStarted","Data":"1cb1b5130290022c2878cf7b210b918d8e14a83a3d75870924c650529706a963"} Mar 19 12:03:42.967541 master-0 kubenswrapper[17644]: I0319 12:03:42.967450 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:42.968219 master-0 kubenswrapper[17644]: E0319 12:03:42.967693 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle podName:8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8 nodeName:}" failed. No retries permitted until 2026-03-19 12:03:46.967670132 +0000 UTC m=+260.737628157 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:42.973046 master-0 kubenswrapper[17644]: I0319 12:03:42.972975 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7f479f8754-7s22b"] Mar 19 12:03:42.973985 master-0 kubenswrapper[17644]: I0319 12:03:42.973936 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:42.979372 master-0 kubenswrapper[17644]: I0319 12:03:42.979219 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-5j36i3nc1dj99" Mar 19 12:03:42.981144 master-0 kubenswrapper[17644]: I0319 12:03:42.981074 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"] Mar 19 12:03:42.992959 master-0 kubenswrapper[17644]: I0319 12:03:42.992902 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f479f8754-7s22b"] Mar 19 12:03:42.993103 master-0 kubenswrapper[17644]: I0319 12:03:42.992987 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" podUID="5f8c022c-7871-4765-971f-dcafa39357c9" containerName="metrics-server" containerID="cri-o://72a73422baa1bf839575e34cbe90d73e29ac03ab1786e2499f59601d503649f6" gracePeriod=170 Mar 19 12:03:43.068880 master-0 kubenswrapper[17644]: I0319 12:03:43.068802 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f7a977-41a3-4668-9cc4-1330f87bdd29-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.069237 master-0 kubenswrapper[17644]: I0319 12:03:43.068936 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/33f7a977-41a3-4668-9cc4-1330f87bdd29-audit-log\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.069237 master-0 
kubenswrapper[17644]: I0319 12:03:43.069019 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-client-ca-bundle\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.069237 master-0 kubenswrapper[17644]: I0319 12:03:43.069126 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-secret-metrics-client-certs\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.069237 master-0 kubenswrapper[17644]: I0319 12:03:43.069153 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w7r4\" (UniqueName: \"kubernetes.io/projected/33f7a977-41a3-4668-9cc4-1330f87bdd29-kube-api-access-8w7r4\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.069237 master-0 kubenswrapper[17644]: I0319 12:03:43.069196 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-secret-metrics-server-tls\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.069237 master-0 kubenswrapper[17644]: I0319 12:03:43.069216 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" 
(UniqueName: \"kubernetes.io/configmap/33f7a977-41a3-4668-9cc4-1330f87bdd29-metrics-server-audit-profiles\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.170578 master-0 kubenswrapper[17644]: I0319 12:03:43.170501 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-secret-metrics-client-certs\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.170578 master-0 kubenswrapper[17644]: I0319 12:03:43.170557 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w7r4\" (UniqueName: \"kubernetes.io/projected/33f7a977-41a3-4668-9cc4-1330f87bdd29-kube-api-access-8w7r4\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.170578 master-0 kubenswrapper[17644]: I0319 12:03:43.170591 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-secret-metrics-server-tls\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.171098 master-0 kubenswrapper[17644]: I0319 12:03:43.170614 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/33f7a977-41a3-4668-9cc4-1330f87bdd29-metrics-server-audit-profiles\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " 
pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.171098 master-0 kubenswrapper[17644]: I0319 12:03:43.171078 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f7a977-41a3-4668-9cc4-1330f87bdd29-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.172383 master-0 kubenswrapper[17644]: I0319 12:03:43.172352 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f7a977-41a3-4668-9cc4-1330f87bdd29-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.172444 master-0 kubenswrapper[17644]: I0319 12:03:43.172404 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/33f7a977-41a3-4668-9cc4-1330f87bdd29-audit-log\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.172444 master-0 kubenswrapper[17644]: I0319 12:03:43.172429 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-client-ca-bundle\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.172579 master-0 kubenswrapper[17644]: I0319 12:03:43.172555 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" 
(UniqueName: \"kubernetes.io/configmap/33f7a977-41a3-4668-9cc4-1330f87bdd29-metrics-server-audit-profiles\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.173033 master-0 kubenswrapper[17644]: I0319 12:03:43.172991 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/33f7a977-41a3-4668-9cc4-1330f87bdd29-audit-log\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.174618 master-0 kubenswrapper[17644]: I0319 12:03:43.174570 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-secret-metrics-client-certs\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.175133 master-0 kubenswrapper[17644]: I0319 12:03:43.175072 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-secret-metrics-server-tls\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.187759 master-0 kubenswrapper[17644]: I0319 12:03:43.187686 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f7a977-41a3-4668-9cc4-1330f87bdd29-client-ca-bundle\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.191816 master-0 kubenswrapper[17644]: I0319 
12:03:43.191777 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w7r4\" (UniqueName: \"kubernetes.io/projected/33f7a977-41a3-4668-9cc4-1330f87bdd29-kube-api-access-8w7r4\") pod \"metrics-server-7f479f8754-7s22b\" (UID: \"33f7a977-41a3-4668-9cc4-1330f87bdd29\") " pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.304530 master-0 kubenswrapper[17644]: I0319 12:03:43.304311 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:03:43.755584 master-0 kubenswrapper[17644]: I0319 12:03:43.751754 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" event={"ID":"da6da885-6a82-47bd-a90f-ce81d8e78929","Type":"ContainerStarted","Data":"1e606a4c483b3a670af69a3ffc4f9712b3bc8b943f0b0618b2ff130d15b48cb3"} Mar 19 12:03:43.755584 master-0 kubenswrapper[17644]: I0319 12:03:43.751805 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" event={"ID":"da6da885-6a82-47bd-a90f-ce81d8e78929","Type":"ContainerStarted","Data":"d87f4e29b83962daba3d9f99b2253d658dffa9957fc58adb487084ba4e9fa13d"} Mar 19 12:03:43.793111 master-0 kubenswrapper[17644]: I0319 12:03:43.792981 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f479f8754-7s22b"] Mar 19 12:03:43.798104 master-0 kubenswrapper[17644]: W0319 12:03:43.798058 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f7a977_41a3_4668_9cc4_1330f87bdd29.slice/crio-a25e71a0e41fea35ae1afb39beb57124d3213414f88e60a71d6ce79282f07674 WatchSource:0}: Error finding container a25e71a0e41fea35ae1afb39beb57124d3213414f88e60a71d6ce79282f07674: Status 404 returned error can't find the container with id 
a25e71a0e41fea35ae1afb39beb57124d3213414f88e60a71d6ce79282f07674 Mar 19 12:03:43.875650 master-0 kubenswrapper[17644]: I0319 12:03:43.875572 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn"] Mar 19 12:03:43.876752 master-0 kubenswrapper[17644]: I0319 12:03:43.876683 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" Mar 19 12:03:43.892960 master-0 kubenswrapper[17644]: I0319 12:03:43.892898 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/monitoring-plugin-8688f6945-trnd5"] Mar 19 12:03:43.893199 master-0 kubenswrapper[17644]: I0319 12:03:43.893115 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5" podUID="5f747c54-7f5b-4ec9-a16d-7cb13e511f98" containerName="monitoring-plugin" containerID="cri-o://1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90" gracePeriod=30 Mar 19 12:03:43.897678 master-0 kubenswrapper[17644]: I0319 12:03:43.897617 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn"] Mar 19 12:03:43.991383 master-0 kubenswrapper[17644]: I0319 12:03:43.991325 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ef4b0a53-dd65-40cf-adca-8ec46a55d28a-monitoring-plugin-cert\") pod \"monitoring-plugin-69dbdb4674-llwxn\" (UID: \"ef4b0a53-dd65-40cf-adca-8ec46a55d28a\") " pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" Mar 19 12:03:44.096859 master-0 kubenswrapper[17644]: I0319 12:03:44.096701 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ef4b0a53-dd65-40cf-adca-8ec46a55d28a-monitoring-plugin-cert\") 
pod \"monitoring-plugin-69dbdb4674-llwxn\" (UID: \"ef4b0a53-dd65-40cf-adca-8ec46a55d28a\") " pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" Mar 19 12:03:44.107424 master-0 kubenswrapper[17644]: I0319 12:03:44.107308 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ef4b0a53-dd65-40cf-adca-8ec46a55d28a-monitoring-plugin-cert\") pod \"monitoring-plugin-69dbdb4674-llwxn\" (UID: \"ef4b0a53-dd65-40cf-adca-8ec46a55d28a\") " pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" Mar 19 12:03:44.296927 master-0 kubenswrapper[17644]: I0319 12:03:44.296676 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" Mar 19 12:03:44.400813 master-0 kubenswrapper[17644]: I0319 12:03:44.399949 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-8688f6945-trnd5_5f747c54-7f5b-4ec9-a16d-7cb13e511f98/monitoring-plugin/0.log" Mar 19 12:03:44.400813 master-0 kubenswrapper[17644]: I0319 12:03:44.400023 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5" Mar 19 12:03:44.509289 master-0 kubenswrapper[17644]: I0319 12:03:44.507383 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f747c54-7f5b-4ec9-a16d-7cb13e511f98-monitoring-plugin-cert\") pod \"5f747c54-7f5b-4ec9-a16d-7cb13e511f98\" (UID: \"5f747c54-7f5b-4ec9-a16d-7cb13e511f98\") " Mar 19 12:03:44.511465 master-0 kubenswrapper[17644]: I0319 12:03:44.511205 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f747c54-7f5b-4ec9-a16d-7cb13e511f98-monitoring-plugin-cert" (OuterVolumeSpecName: "monitoring-plugin-cert") pod "5f747c54-7f5b-4ec9-a16d-7cb13e511f98" (UID: "5f747c54-7f5b-4ec9-a16d-7cb13e511f98"). InnerVolumeSpecName "monitoring-plugin-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:03:44.586856 master-0 kubenswrapper[17644]: I0319 12:03:44.584153 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:03:44.586856 master-0 kubenswrapper[17644]: E0319 12:03:44.584587 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f747c54-7f5b-4ec9-a16d-7cb13e511f98" containerName="monitoring-plugin" Mar 19 12:03:44.586856 master-0 kubenswrapper[17644]: I0319 12:03:44.584609 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f747c54-7f5b-4ec9-a16d-7cb13e511f98" containerName="monitoring-plugin" Mar 19 12:03:44.586856 master-0 kubenswrapper[17644]: I0319 12:03:44.584776 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f747c54-7f5b-4ec9-a16d-7cb13e511f98" containerName="monitoring-plugin" Mar 19 12:03:44.600344 master-0 kubenswrapper[17644]: I0319 12:03:44.600268 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.611203 master-0 kubenswrapper[17644]: I0319 12:03:44.611042 17644 reconciler_common.go:293] "Volume detached for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f747c54-7f5b-4ec9-a16d-7cb13e511f98-monitoring-plugin-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:44.628543 master-0 kubenswrapper[17644]: I0319 12:03:44.628480 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 19 12:03:44.629692 master-0 kubenswrapper[17644]: I0319 12:03:44.629601 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 19 12:03:44.630111 master-0 kubenswrapper[17644]: I0319 12:03:44.630061 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 12:03:44.630180 master-0 kubenswrapper[17644]: I0319 12:03:44.630152 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 19 12:03:44.630274 master-0 kubenswrapper[17644]: I0319 12:03:44.630230 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 19 12:03:44.630551 master-0 kubenswrapper[17644]: I0319 12:03:44.630523 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 19 12:03:44.631176 master-0 kubenswrapper[17644]: I0319 12:03:44.630790 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-aprstf5fs6eqr" Mar 19 12:03:44.631176 master-0 kubenswrapper[17644]: I0319 12:03:44.630985 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 19 12:03:44.631176 master-0 
kubenswrapper[17644]: I0319 12:03:44.631136 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 19 12:03:44.631310 master-0 kubenswrapper[17644]: I0319 12:03:44.631298 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 19 12:03:44.632049 master-0 kubenswrapper[17644]: I0319 12:03:44.632017 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 19 12:03:44.646448 master-0 kubenswrapper[17644]: I0319 12:03:44.644054 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 19 12:03:44.657769 master-0 kubenswrapper[17644]: I0319 12:03:44.657281 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721381 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721431 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721481 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwsl\" (UniqueName: 
\"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-kube-api-access-gzwsl\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721500 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721523 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721550 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721612 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721633 17644 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721650 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721669 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721705 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-web-config\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721723 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721780 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721798 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.721947 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config-out\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.722000 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.722046 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.722754 master-0 kubenswrapper[17644]: I0319 12:03:44.722086 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.792508 master-0 kubenswrapper[17644]: I0319 12:03:44.790209 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" event={"ID":"da6da885-6a82-47bd-a90f-ce81d8e78929","Type":"ContainerStarted","Data":"b7e0b4d54ee3b475ee505a0c89dc98bdad136cdcd82c9621351fcd8cce48f1b9"} Mar 19 12:03:44.794824 master-0 kubenswrapper[17644]: I0319 12:03:44.793253 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" event={"ID":"33f7a977-41a3-4668-9cc4-1330f87bdd29","Type":"ContainerStarted","Data":"0a738538b983b4ad088d5fa408371dd4505ee50135fe0881656482253398215a"} Mar 19 12:03:44.794824 master-0 kubenswrapper[17644]: I0319 12:03:44.793312 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" event={"ID":"33f7a977-41a3-4668-9cc4-1330f87bdd29","Type":"ContainerStarted","Data":"a25e71a0e41fea35ae1afb39beb57124d3213414f88e60a71d6ce79282f07674"} Mar 19 12:03:44.797103 master-0 kubenswrapper[17644]: I0319 12:03:44.796188 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-8688f6945-trnd5_5f747c54-7f5b-4ec9-a16d-7cb13e511f98/monitoring-plugin/0.log" Mar 19 12:03:44.797103 master-0 kubenswrapper[17644]: I0319 12:03:44.796253 17644 generic.go:334] "Generic (PLEG): container finished" podID="5f747c54-7f5b-4ec9-a16d-7cb13e511f98" containerID="1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90" exitCode=2 Mar 19 12:03:44.797103 master-0 kubenswrapper[17644]: I0319 12:03:44.796295 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5" event={"ID":"5f747c54-7f5b-4ec9-a16d-7cb13e511f98","Type":"ContainerDied","Data":"1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90"} Mar 19 12:03:44.797103 master-0 kubenswrapper[17644]: I0319 12:03:44.796325 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5" event={"ID":"5f747c54-7f5b-4ec9-a16d-7cb13e511f98","Type":"ContainerDied","Data":"e9cbbce21e78d2f3fa5dc18e35f678f40a16f326436d7fc06c81b7d9ddbe9ddb"} Mar 19 12:03:44.797103 master-0 kubenswrapper[17644]: I0319 12:03:44.796347 17644 scope.go:117] "RemoveContainer" containerID="1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90" Mar 19 12:03:44.797590 master-0 kubenswrapper[17644]: I0319 12:03:44.797558 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-8688f6945-trnd5" Mar 19 12:03:44.827161 master-0 kubenswrapper[17644]: I0319 12:03:44.827085 17644 scope.go:117] "RemoveContainer" containerID="1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90" Mar 19 12:03:44.827597 master-0 kubenswrapper[17644]: E0319 12:03:44.827552 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90\": container with ID starting with 1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90 not found: ID does not exist" containerID="1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90" Mar 19 12:03:44.827661 master-0 kubenswrapper[17644]: I0319 12:03:44.827602 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90"} err="failed to get container status \"1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90\": rpc error: code = NotFound desc = could not find container \"1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90\": container with ID starting with 1e277bdf4bbacbc4c8a7951052568a9d5cb0b455d4403de87fe314d95e164a90 not found: ID does not exist" Mar 19 12:03:44.828448 master-0 kubenswrapper[17644]: I0319 12:03:44.828382 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.828536 master-0 kubenswrapper[17644]: I0319 12:03:44.828458 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.828536 master-0 kubenswrapper[17644]: I0319 12:03:44.828515 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-web-config\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.828633 master-0 kubenswrapper[17644]: I0319 12:03:44.828543 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.828633 master-0 kubenswrapper[17644]: I0319 12:03:44.828591 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.829112 master-0 kubenswrapper[17644]: I0319 12:03:44.829073 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.829219 master-0 kubenswrapper[17644]: I0319 12:03:44.829182 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.829304 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config-out\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.829874 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.829978 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830050 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830128 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830157 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830242 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwsl\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-kube-api-access-gzwsl\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830267 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830312 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 
master-0 kubenswrapper[17644]: I0319 12:03:44.830346 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830385 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830425 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.830914 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.831828 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.832557 master-0 kubenswrapper[17644]: I0319 12:03:44.831950 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.834803 master-0 kubenswrapper[17644]: I0319 12:03:44.834565 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.835468 master-0 kubenswrapper[17644]: I0319 12:03:44.835029 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-web-config\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.835983 master-0 kubenswrapper[17644]: I0319 12:03:44.835943 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.836118 master-0 kubenswrapper[17644]: E0319 12:03:44.836090 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle podName:5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e nodeName:}" failed. 
No retries permitted until 2026-03-19 12:03:45.336067485 +0000 UTC m=+259.106025520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:44.837821 master-0 kubenswrapper[17644]: I0319 12:03:44.837669 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" podStartSLOduration=2.8376312439999998 podStartE2EDuration="2.837631244s" podCreationTimestamp="2026-03-19 12:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:03:44.820257044 +0000 UTC m=+258.590215109" watchObservedRunningTime="2026-03-19 12:03:44.837631244 +0000 UTC m=+258.607589279" Mar 19 12:03:44.839266 master-0 kubenswrapper[17644]: I0319 12:03:44.837775 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.839758 master-0 kubenswrapper[17644]: I0319 12:03:44.839701 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config-out\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.840400 master-0 kubenswrapper[17644]: I0319 12:03:44.840355 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.855583 master-0 kubenswrapper[17644]: I0319 12:03:44.844061 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.855583 master-0 kubenswrapper[17644]: I0319 12:03:44.847324 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.855583 master-0 kubenswrapper[17644]: I0319 12:03:44.847341 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.855583 master-0 kubenswrapper[17644]: I0319 12:03:44.849573 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.855583 master-0 kubenswrapper[17644]: I0319 12:03:44.849804 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.855583 master-0 kubenswrapper[17644]: I0319 12:03:44.854810 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.858613 master-0 kubenswrapper[17644]: I0319 12:03:44.858551 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn"] Mar 19 12:03:44.870967 master-0 kubenswrapper[17644]: I0319 12:03:44.870586 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwsl\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-kube-api-access-gzwsl\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:44.875584 master-0 kubenswrapper[17644]: I0319 12:03:44.875493 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/monitoring-plugin-8688f6945-trnd5"] Mar 19 12:03:44.881686 master-0 kubenswrapper[17644]: I0319 12:03:44.881570 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/monitoring-plugin-8688f6945-trnd5"] Mar 19 12:03:45.338391 master-0 kubenswrapper[17644]: I0319 12:03:45.338301 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:45.339100 master-0 kubenswrapper[17644]: E0319 12:03:45.338480 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle podName:5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e nodeName:}" failed. No retries permitted until 2026-03-19 12:03:46.338463586 +0000 UTC m=+260.108421621 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:45.804074 master-0 kubenswrapper[17644]: I0319 12:03:45.803952 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" event={"ID":"ef4b0a53-dd65-40cf-adca-8ec46a55d28a","Type":"ContainerStarted","Data":"69b550f57d9116d55e8de526bd05db9d32d8ae18b47109523128c01080c9683c"} Mar 19 12:03:45.804074 master-0 kubenswrapper[17644]: I0319 12:03:45.804008 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" event={"ID":"ef4b0a53-dd65-40cf-adca-8ec46a55d28a","Type":"ContainerStarted","Data":"c6e9d533eccbef7c8024054212cba42db93611576a03c19a13776bc8fb698e3c"} Mar 19 12:03:45.804074 master-0 kubenswrapper[17644]: I0319 12:03:45.804026 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" Mar 19 12:03:45.809310 master-0 kubenswrapper[17644]: I0319 12:03:45.809277 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" Mar 19 12:03:45.846970 master-0 kubenswrapper[17644]: I0319 12:03:45.846884 17644 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-69dbdb4674-llwxn" podStartSLOduration=2.846864135 podStartE2EDuration="2.846864135s" podCreationTimestamp="2026-03-19 12:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:03:45.824361907 +0000 UTC m=+259.594319962" watchObservedRunningTime="2026-03-19 12:03:45.846864135 +0000 UTC m=+259.616822170" Mar 19 12:03:46.358648 master-0 kubenswrapper[17644]: I0319 12:03:46.358553 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:46.359436 master-0 kubenswrapper[17644]: E0319 12:03:46.358849 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle podName:5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e nodeName:}" failed. No retries permitted until 2026-03-19 12:03:48.358802482 +0000 UTC m=+262.128760687 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:46.499153 master-0 kubenswrapper[17644]: I0319 12:03:46.499048 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f747c54-7f5b-4ec9-a16d-7cb13e511f98" path="/var/lib/kubelet/pods/5f747c54-7f5b-4ec9-a16d-7cb13e511f98/volumes" Mar 19 12:03:46.971562 master-0 kubenswrapper[17644]: I0319 12:03:46.971470 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:46.971672 master-0 kubenswrapper[17644]: E0319 12:03:46.971645 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle podName:8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8 nodeName:}" failed. No retries permitted until 2026-03-19 12:03:54.971618799 +0000 UTC m=+268.741576934 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:47.604661 master-0 kubenswrapper[17644]: I0319 12:03:47.604521 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-b2dmc"] Mar 19 12:03:47.606089 master-0 kubenswrapper[17644]: I0319 12:03:47.606052 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.607994 master-0 kubenswrapper[17644]: I0319 12:03:47.607952 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-w49qc" Mar 19 12:03:47.608257 master-0 kubenswrapper[17644]: I0319 12:03:47.608234 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 19 12:03:47.788121 master-0 kubenswrapper[17644]: I0319 12:03:47.788019 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69264375-c63b-4be0-80b9-52aefeca1382-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.788121 master-0 kubenswrapper[17644]: I0319 12:03:47.788135 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/69264375-c63b-4be0-80b9-52aefeca1382-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.788615 master-0 
kubenswrapper[17644]: I0319 12:03:47.788172 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/69264375-c63b-4be0-80b9-52aefeca1382-ready\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.788615 master-0 kubenswrapper[17644]: I0319 12:03:47.788261 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b59np\" (UniqueName: \"kubernetes.io/projected/69264375-c63b-4be0-80b9-52aefeca1382-kube-api-access-b59np\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.822010 master-0 kubenswrapper[17644]: I0319 12:03:47.821943 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" event={"ID":"da6da885-6a82-47bd-a90f-ce81d8e78929","Type":"ContainerStarted","Data":"2dbc66439508f2741657842351a698607ffd1da04338c34355e1a97ad6c4eeab"} Mar 19 12:03:47.822457 master-0 kubenswrapper[17644]: I0319 12:03:47.822437 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" event={"ID":"da6da885-6a82-47bd-a90f-ce81d8e78929","Type":"ContainerStarted","Data":"411d9944bfcd91044143be93c47a424dde1300c6dc6e2ce2eaf4f2d4107a5a49"} Mar 19 12:03:47.822576 master-0 kubenswrapper[17644]: I0319 12:03:47.822558 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" event={"ID":"da6da885-6a82-47bd-a90f-ce81d8e78929","Type":"ContainerStarted","Data":"d0379cfee2f912cfee18835f1853429461d576a41d3deccc3853a10446e01411"} Mar 19 12:03:47.850155 master-0 kubenswrapper[17644]: I0319 12:03:47.847480 17644 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" podStartSLOduration=2.18554622 podStartE2EDuration="7.847464165s" podCreationTimestamp="2026-03-19 12:03:40 +0000 UTC" firstStartedPulling="2026-03-19 12:03:41.009869493 +0000 UTC m=+254.779827528" lastFinishedPulling="2026-03-19 12:03:46.671787438 +0000 UTC m=+260.441745473" observedRunningTime="2026-03-19 12:03:47.845553737 +0000 UTC m=+261.615511792" watchObservedRunningTime="2026-03-19 12:03:47.847464165 +0000 UTC m=+261.617422200" Mar 19 12:03:47.890227 master-0 kubenswrapper[17644]: I0319 12:03:47.889999 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b59np\" (UniqueName: \"kubernetes.io/projected/69264375-c63b-4be0-80b9-52aefeca1382-kube-api-access-b59np\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.890227 master-0 kubenswrapper[17644]: I0319 12:03:47.890087 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69264375-c63b-4be0-80b9-52aefeca1382-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.890227 master-0 kubenswrapper[17644]: I0319 12:03:47.890127 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/69264375-c63b-4be0-80b9-52aefeca1382-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.890227 master-0 kubenswrapper[17644]: I0319 12:03:47.890169 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/69264375-c63b-4be0-80b9-52aefeca1382-ready\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.890641 master-0 kubenswrapper[17644]: I0319 12:03:47.890428 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69264375-c63b-4be0-80b9-52aefeca1382-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.890641 master-0 kubenswrapper[17644]: I0319 12:03:47.890572 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/69264375-c63b-4be0-80b9-52aefeca1382-ready\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.891106 master-0 kubenswrapper[17644]: I0319 12:03:47.891064 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/69264375-c63b-4be0-80b9-52aefeca1382-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.907021 master-0 kubenswrapper[17644]: I0319 12:03:47.906950 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b59np\" (UniqueName: \"kubernetes.io/projected/69264375-c63b-4be0-80b9-52aefeca1382-kube-api-access-b59np\") pod \"cni-sysctl-allowlist-ds-b2dmc\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.925196 master-0 kubenswrapper[17644]: I0319 12:03:47.925129 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:47.949103 master-0 kubenswrapper[17644]: W0319 12:03:47.949029 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69264375_c63b_4be0_80b9_52aefeca1382.slice/crio-9f9d47e4dd383fdf4e789661c83b79b4e65e3937e77d461da0de6315684f5f8e WatchSource:0}: Error finding container 9f9d47e4dd383fdf4e789661c83b79b4e65e3937e77d461da0de6315684f5f8e: Status 404 returned error can't find the container with id 9f9d47e4dd383fdf4e789661c83b79b4e65e3937e77d461da0de6315684f5f8e Mar 19 12:03:48.398606 master-0 kubenswrapper[17644]: I0319 12:03:48.398534 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:48.399094 master-0 kubenswrapper[17644]: E0319 12:03:48.398713 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle podName:5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e nodeName:}" failed. No retries permitted until 2026-03-19 12:03:52.398688155 +0000 UTC m=+266.168646190 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e") : configmap references non-existent config key: ca-bundle.crt Mar 19 12:03:48.836696 master-0 kubenswrapper[17644]: I0319 12:03:48.836622 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" event={"ID":"69264375-c63b-4be0-80b9-52aefeca1382","Type":"ContainerStarted","Data":"02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20"} Mar 19 12:03:48.837209 master-0 kubenswrapper[17644]: I0319 12:03:48.836704 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" event={"ID":"69264375-c63b-4be0-80b9-52aefeca1382","Type":"ContainerStarted","Data":"9f9d47e4dd383fdf4e789661c83b79b4e65e3937e77d461da0de6315684f5f8e"} Mar 19 12:03:48.837401 master-0 kubenswrapper[17644]: I0319 12:03:48.837339 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:48.837802 master-0 kubenswrapper[17644]: I0319 12:03:48.837717 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:48.846794 master-0 kubenswrapper[17644]: I0319 12:03:48.846717 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-747db875db-zc5nj" Mar 19 12:03:48.863016 master-0 kubenswrapper[17644]: I0319 12:03:48.862902 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" podStartSLOduration=1.862880159 podStartE2EDuration="1.862880159s" podCreationTimestamp="2026-03-19 12:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:03:48.859096825 +0000 UTC m=+262.629054880" watchObservedRunningTime="2026-03-19 12:03:48.862880159 +0000 UTC m=+262.632838204" Mar 19 12:03:49.861180 master-0 kubenswrapper[17644]: I0319 12:03:49.861124 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:03:50.603322 master-0 kubenswrapper[17644]: I0319 12:03:50.603236 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-b2dmc"] Mar 19 12:03:51.853828 master-0 kubenswrapper[17644]: I0319 12:03:51.853692 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" podUID="69264375-c63b-4be0-80b9-52aefeca1382" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" gracePeriod=30 Mar 19 12:03:52.472256 master-0 kubenswrapper[17644]: I0319 12:03:52.472186 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:52.473637 master-0 kubenswrapper[17644]: I0319 12:03:52.473592 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:52.553815 master-0 kubenswrapper[17644]: I0319 12:03:52.553691 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:03:52.977982 master-0 kubenswrapper[17644]: I0319 12:03:52.977919 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:03:52.989341 master-0 kubenswrapper[17644]: W0319 12:03:52.989276 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aee1f5a_01e6_47b5_91ac_4c77f2b0f54e.slice/crio-5c7857ea1a010de91399569238d9a1c4c44df25ba2ca42c7577db11326546d38 WatchSource:0}: Error finding container 5c7857ea1a010de91399569238d9a1c4c44df25ba2ca42c7577db11326546d38: Status 404 returned error can't find the container with id 5c7857ea1a010de91399569238d9a1c4c44df25ba2ca42c7577db11326546d38 Mar 19 12:03:52.991001 master-0 kubenswrapper[17644]: I0319 12:03:52.990960 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-66b5747dd4-gk5bw"] Mar 19 12:03:52.992275 master-0 kubenswrapper[17644]: I0319 12:03:52.992249 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:52.996465 master-0 kubenswrapper[17644]: I0319 12:03:52.996426 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-rb6b2" Mar 19 12:03:52.996641 master-0 kubenswrapper[17644]: I0319 12:03:52.996620 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 19 12:03:52.996803 master-0 kubenswrapper[17644]: I0319 12:03:52.996764 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 19 12:03:52.996958 master-0 kubenswrapper[17644]: I0319 12:03:52.996933 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 19 12:03:52.997300 master-0 kubenswrapper[17644]: I0319 12:03:52.997278 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 19 12:03:52.997468 master-0 kubenswrapper[17644]: I0319 12:03:52.997443 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 19 12:03:53.008172 master-0 kubenswrapper[17644]: I0319 12:03:53.008099 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 19 12:03:53.037857 master-0 kubenswrapper[17644]: I0319 12:03:53.037789 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-66b5747dd4-gk5bw"] Mar 19 12:03:53.181483 master-0 kubenswrapper[17644]: I0319 12:03:53.181431 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-telemeter-client-tls\") pod 
\"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.181723 master-0 kubenswrapper[17644]: I0319 12:03:53.181525 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvns\" (UniqueName: \"kubernetes.io/projected/213436b9-a964-4083-9187-65c82be4bb24-kube-api-access-rxvns\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.181723 master-0 kubenswrapper[17644]: I0319 12:03:53.181553 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-federate-client-tls\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.181723 master-0 kubenswrapper[17644]: I0319 12:03:53.181606 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-serving-certs-ca-bundle\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.181723 master-0 kubenswrapper[17644]: I0319 12:03:53.181650 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " 
pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.181723 master-0 kubenswrapper[17644]: I0319 12:03:53.181672 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.181723 master-0 kubenswrapper[17644]: I0319 12:03:53.181717 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-secret-telemeter-client\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.181989 master-0 kubenswrapper[17644]: I0319 12:03:53.181760 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-metrics-client-ca\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.283222 master-0 kubenswrapper[17644]: I0319 12:03:53.283091 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.283222 master-0 kubenswrapper[17644]: I0319 12:03:53.283143 
17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284291 master-0 kubenswrapper[17644]: I0319 12:03:53.283264 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-secret-telemeter-client\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284291 master-0 kubenswrapper[17644]: I0319 12:03:53.283286 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-metrics-client-ca\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284291 master-0 kubenswrapper[17644]: I0319 12:03:53.283303 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-telemeter-client-tls\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284291 master-0 kubenswrapper[17644]: I0319 12:03:53.283558 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvns\" (UniqueName: \"kubernetes.io/projected/213436b9-a964-4083-9187-65c82be4bb24-kube-api-access-rxvns\") pod 
\"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284291 master-0 kubenswrapper[17644]: I0319 12:03:53.283647 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-federate-client-tls\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284291 master-0 kubenswrapper[17644]: I0319 12:03:53.283833 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-serving-certs-ca-bundle\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284525 master-0 kubenswrapper[17644]: I0319 12:03:53.284320 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-metrics-client-ca\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284644 master-0 kubenswrapper[17644]: I0319 12:03:53.284597 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.284860 master-0 kubenswrapper[17644]: I0319 12:03:53.284831 17644 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213436b9-a964-4083-9187-65c82be4bb24-serving-certs-ca-bundle\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.287442 master-0 kubenswrapper[17644]: I0319 12:03:53.287389 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-federate-client-tls\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.287696 master-0 kubenswrapper[17644]: I0319 12:03:53.287647 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-secret-telemeter-client\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.287981 master-0 kubenswrapper[17644]: I0319 12:03:53.287950 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.289529 master-0 kubenswrapper[17644]: I0319 12:03:53.289493 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/213436b9-a964-4083-9187-65c82be4bb24-telemeter-client-tls\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: 
\"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.302653 master-0 kubenswrapper[17644]: I0319 12:03:53.302608 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvns\" (UniqueName: \"kubernetes.io/projected/213436b9-a964-4083-9187-65c82be4bb24-kube-api-access-rxvns\") pod \"telemeter-client-66b5747dd4-gk5bw\" (UID: \"213436b9-a964-4083-9187-65c82be4bb24\") " pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.330833 master-0 kubenswrapper[17644]: I0319 12:03:53.330765 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" Mar 19 12:03:53.863333 master-0 kubenswrapper[17644]: I0319 12:03:53.863274 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerStarted","Data":"5c7857ea1a010de91399569238d9a1c4c44df25ba2ca42c7577db11326546d38"} Mar 19 12:03:54.071031 master-0 kubenswrapper[17644]: I0319 12:03:54.070957 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-66b5747dd4-gk5bw"] Mar 19 12:03:54.071966 master-0 kubenswrapper[17644]: W0319 12:03:54.071918 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod213436b9_a964_4083_9187_65c82be4bb24.slice/crio-ce8d51def5fca18ce7d6bdeb9bc377c82e9ce4fda866812aa355c6ba35e06049 WatchSource:0}: Error finding container ce8d51def5fca18ce7d6bdeb9bc377c82e9ce4fda866812aa355c6ba35e06049: Status 404 returned error can't find the container with id ce8d51def5fca18ce7d6bdeb9bc377c82e9ce4fda866812aa355c6ba35e06049 Mar 19 12:03:54.777981 master-0 kubenswrapper[17644]: I0319 12:03:54.777784 17644 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 19 12:03:54.779050 master-0 kubenswrapper[17644]: I0319 12:03:54.778917 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:54.785086 master-0 kubenswrapper[17644]: I0319 12:03:54.785047 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-g6g26" Mar 19 12:03:54.785509 master-0 kubenswrapper[17644]: I0319 12:03:54.785484 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 12:03:54.792233 master-0 kubenswrapper[17644]: I0319 12:03:54.792164 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 19 12:03:54.872909 master-0 kubenswrapper[17644]: I0319 12:03:54.872829 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" event={"ID":"213436b9-a964-4083-9187-65c82be4bb24","Type":"ContainerStarted","Data":"ce8d51def5fca18ce7d6bdeb9bc377c82e9ce4fda866812aa355c6ba35e06049"} Mar 19 12:03:54.907211 master-0 kubenswrapper[17644]: I0319 12:03:54.907167 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:54.908079 master-0 kubenswrapper[17644]: I0319 12:03:54.908062 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-var-lock\") pod \"installer-2-master-0\" (UID: 
\"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:54.908205 master-0 kubenswrapper[17644]: I0319 12:03:54.908191 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6602edde-61c4-4316-a2ca-a21c764eb590-kube-api-access\") pod \"installer-2-master-0\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:55.010594 master-0 kubenswrapper[17644]: I0319 12:03:55.010540 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6602edde-61c4-4316-a2ca-a21c764eb590-kube-api-access\") pod \"installer-2-master-0\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:55.012473 master-0 kubenswrapper[17644]: I0319 12:03:55.011697 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:55.012473 master-0 kubenswrapper[17644]: I0319 12:03:55.011779 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:55.012473 master-0 kubenswrapper[17644]: I0319 12:03:55.011812 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-var-lock\") pod \"installer-2-master-0\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:55.012473 master-0 kubenswrapper[17644]: I0319 12:03:55.011888 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-var-lock\") pod \"installer-2-master-0\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:55.012473 master-0 kubenswrapper[17644]: I0319 12:03:55.011926 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:55.015516 master-0 kubenswrapper[17644]: I0319 12:03:55.013314 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:55.030017 master-0 kubenswrapper[17644]: I0319 12:03:55.029875 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6602edde-61c4-4316-a2ca-a21c764eb590-kube-api-access\") pod \"installer-2-master-0\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:55.159784 master-0 kubenswrapper[17644]: I0319 12:03:55.159674 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:03:55.263487 master-0 kubenswrapper[17644]: I0319 12:03:55.262763 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 12:03:55.575445 master-0 kubenswrapper[17644]: I0319 12:03:55.574813 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:03:55.578331 master-0 kubenswrapper[17644]: W0319 12:03:55.578266 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f4cfb4b_ef6e_40d5_a1c1_fbebbece2dc8.slice/crio-e3653f9b87f745028de768fe39f1b91bd3018e50e0ce6566f3984acf7c35f394 WatchSource:0}: Error finding container e3653f9b87f745028de768fe39f1b91bd3018e50e0ce6566f3984acf7c35f394: Status 404 returned error can't find the container with id e3653f9b87f745028de768fe39f1b91bd3018e50e0ce6566f3984acf7c35f394 Mar 19 12:03:55.700480 master-0 kubenswrapper[17644]: I0319 12:03:55.700385 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 19 12:03:55.704259 master-0 kubenswrapper[17644]: W0319 12:03:55.704164 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6602edde_61c4_4316_a2ca_a21c764eb590.slice/crio-7f8fd2be4ab4ed9016a4ef756d06935635f3093245c8c659f437f84c27edbbec WatchSource:0}: Error finding container 7f8fd2be4ab4ed9016a4ef756d06935635f3093245c8c659f437f84c27edbbec: Status 404 returned error can't find the container with id 7f8fd2be4ab4ed9016a4ef756d06935635f3093245c8c659f437f84c27edbbec Mar 19 12:03:55.893436 master-0 kubenswrapper[17644]: I0319 12:03:55.893383 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" 
event={"ID":"6602edde-61c4-4316-a2ca-a21c764eb590","Type":"ContainerStarted","Data":"7f8fd2be4ab4ed9016a4ef756d06935635f3093245c8c659f437f84c27edbbec"} Mar 19 12:03:55.906703 master-0 kubenswrapper[17644]: I0319 12:03:55.906662 17644 generic.go:334] "Generic (PLEG): container finished" podID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71" exitCode=0 Mar 19 12:03:55.906869 master-0 kubenswrapper[17644]: I0319 12:03:55.906800 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerDied","Data":"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"} Mar 19 12:03:55.908886 master-0 kubenswrapper[17644]: I0319 12:03:55.908838 17644 generic.go:334] "Generic (PLEG): container finished" podID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerID="b00de993f0e0b8655ba99ede10174e49c870ebb69aeba71e72944bf22b99febf" exitCode=0 Mar 19 12:03:55.908886 master-0 kubenswrapper[17644]: I0319 12:03:55.908877 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerDied","Data":"b00de993f0e0b8655ba99ede10174e49c870ebb69aeba71e72944bf22b99febf"} Mar 19 12:03:55.908998 master-0 kubenswrapper[17644]: I0319 12:03:55.908899 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerStarted","Data":"e3653f9b87f745028de768fe39f1b91bd3018e50e0ce6566f3984acf7c35f394"} Mar 19 12:03:56.921746 master-0 kubenswrapper[17644]: I0319 12:03:56.921613 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" 
event={"ID":"6602edde-61c4-4316-a2ca-a21c764eb590","Type":"ContainerStarted","Data":"47bdda6e9df1906851b44214a81715bce9596f58c9f5ceb95fe1c2e7f3bea6e0"} Mar 19 12:03:56.946913 master-0 kubenswrapper[17644]: I0319 12:03:56.946816 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=2.946793186 podStartE2EDuration="2.946793186s" podCreationTimestamp="2026-03-19 12:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:03:56.941457964 +0000 UTC m=+270.711416019" watchObservedRunningTime="2026-03-19 12:03:56.946793186 +0000 UTC m=+270.716751221" Mar 19 12:03:57.318658 master-0 kubenswrapper[17644]: I0319 12:03:57.318603 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-phc8b"] Mar 19 12:03:57.322749 master-0 kubenswrapper[17644]: I0319 12:03:57.320164 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-649577484c-phc8b" Mar 19 12:03:57.351784 master-0 kubenswrapper[17644]: I0319 12:03:57.350195 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204-webhook-certs\") pod \"multus-admission-controller-649577484c-phc8b\" (UID: \"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204\") " pod="openshift-multus/multus-admission-controller-649577484c-phc8b" Mar 19 12:03:57.351784 master-0 kubenswrapper[17644]: I0319 12:03:57.350256 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk6ss\" (UniqueName: \"kubernetes.io/projected/eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204-kube-api-access-rk6ss\") pod \"multus-admission-controller-649577484c-phc8b\" (UID: \"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204\") " pod="openshift-multus/multus-admission-controller-649577484c-phc8b" Mar 19 12:03:57.360757 master-0 kubenswrapper[17644]: I0319 12:03:57.355869 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-phc8b"] Mar 19 12:03:57.451286 master-0 kubenswrapper[17644]: I0319 12:03:57.450864 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204-webhook-certs\") pod \"multus-admission-controller-649577484c-phc8b\" (UID: \"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204\") " pod="openshift-multus/multus-admission-controller-649577484c-phc8b" Mar 19 12:03:57.451286 master-0 kubenswrapper[17644]: I0319 12:03:57.450957 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk6ss\" (UniqueName: \"kubernetes.io/projected/eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204-kube-api-access-rk6ss\") pod \"multus-admission-controller-649577484c-phc8b\" 
(UID: \"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204\") " pod="openshift-multus/multus-admission-controller-649577484c-phc8b" Mar 19 12:03:57.455722 master-0 kubenswrapper[17644]: I0319 12:03:57.455690 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204-webhook-certs\") pod \"multus-admission-controller-649577484c-phc8b\" (UID: \"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204\") " pod="openshift-multus/multus-admission-controller-649577484c-phc8b" Mar 19 12:03:57.478396 master-0 kubenswrapper[17644]: I0319 12:03:57.478320 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk6ss\" (UniqueName: \"kubernetes.io/projected/eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204-kube-api-access-rk6ss\") pod \"multus-admission-controller-649577484c-phc8b\" (UID: \"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204\") " pod="openshift-multus/multus-admission-controller-649577484c-phc8b" Mar 19 12:03:57.646380 master-0 kubenswrapper[17644]: I0319 12:03:57.646325 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-649577484c-phc8b" Mar 19 12:03:57.927412 master-0 kubenswrapper[17644]: E0319 12:03:57.927324 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:03:57.931246 master-0 kubenswrapper[17644]: E0319 12:03:57.931210 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:03:57.932960 master-0 kubenswrapper[17644]: E0319 12:03:57.932642 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:03:57.932960 master-0 kubenswrapper[17644]: E0319 12:03:57.932691 17644 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" podUID="69264375-c63b-4be0-80b9-52aefeca1382" containerName="kube-multus-additional-cni-plugins" Mar 19 12:03:57.934523 master-0 kubenswrapper[17644]: I0319 12:03:57.934493 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" 
event={"ID":"213436b9-a964-4083-9187-65c82be4bb24","Type":"ContainerStarted","Data":"0ec3b3620573a0cfe4604cb49041e8ffcf4bd642c737d21e6d469366b859572a"} Mar 19 12:03:57.934621 master-0 kubenswrapper[17644]: I0319 12:03:57.934529 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" event={"ID":"213436b9-a964-4083-9187-65c82be4bb24","Type":"ContainerStarted","Data":"230b67b5c1154821200b88967f199951844beda9a86e4ecad58fd003e4073336"} Mar 19 12:03:57.934621 master-0 kubenswrapper[17644]: I0319 12:03:57.934544 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" event={"ID":"213436b9-a964-4083-9187-65c82be4bb24","Type":"ContainerStarted","Data":"c559e119845fb6cfdf9cd16f90fb1171c7ec99b36a6a1381236f8eade33ea534"} Mar 19 12:03:58.081278 master-0 kubenswrapper[17644]: I0319 12:03:58.081160 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-66b5747dd4-gk5bw" podStartSLOduration=3.105767072 podStartE2EDuration="6.081137227s" podCreationTimestamp="2026-03-19 12:03:52 +0000 UTC" firstStartedPulling="2026-03-19 12:03:54.074797691 +0000 UTC m=+267.844755726" lastFinishedPulling="2026-03-19 12:03:57.050167856 +0000 UTC m=+270.820125881" observedRunningTime="2026-03-19 12:03:57.967649498 +0000 UTC m=+271.737607543" watchObservedRunningTime="2026-03-19 12:03:58.081137227 +0000 UTC m=+271.851095272" Mar 19 12:03:58.083763 master-0 kubenswrapper[17644]: W0319 12:03:58.083680 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb7a7fb9_31c8_4038_81bb_c3f8ca2ec204.slice/crio-9dfafc0f711366b2d107141dfdf424c9ac62bbecb59814d0d8d93deb028152b3 WatchSource:0}: Error finding container 9dfafc0f711366b2d107141dfdf424c9ac62bbecb59814d0d8d93deb028152b3: Status 404 returned error can't find the container with id 
9dfafc0f711366b2d107141dfdf424c9ac62bbecb59814d0d8d93deb028152b3 Mar 19 12:03:58.084579 master-0 kubenswrapper[17644]: I0319 12:03:58.084559 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-phc8b"] Mar 19 12:03:58.945827 master-0 kubenswrapper[17644]: I0319 12:03:58.945773 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-phc8b" event={"ID":"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204","Type":"ContainerStarted","Data":"03b522420034852af392eb0a7a69d0929195376e563e5dc639f2f1f9dd8576f3"} Mar 19 12:03:58.946394 master-0 kubenswrapper[17644]: I0319 12:03:58.945861 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-phc8b" event={"ID":"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204","Type":"ContainerStarted","Data":"531566cea9ad5de2a4c94f747e6ae11b8674500587881afe4a1d788afa4af172"} Mar 19 12:03:58.946394 master-0 kubenswrapper[17644]: I0319 12:03:58.945876 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-phc8b" event={"ID":"eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204","Type":"ContainerStarted","Data":"9dfafc0f711366b2d107141dfdf424c9ac62bbecb59814d0d8d93deb028152b3"} Mar 19 12:03:59.000749 master-0 kubenswrapper[17644]: I0319 12:03:59.000649 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-649577484c-phc8b" podStartSLOduration=2.000629249 podStartE2EDuration="2.000629249s" podCreationTimestamp="2026-03-19 12:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:03:58.982279546 +0000 UTC m=+272.752237581" watchObservedRunningTime="2026-03-19 12:03:59.000629249 +0000 UTC m=+272.770587284" Mar 19 12:03:59.019300 master-0 kubenswrapper[17644]: I0319 
12:03:59.019245 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"] Mar 19 12:03:59.019599 master-0 kubenswrapper[17644]: I0319 12:03:59.019558 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerName="multus-admission-controller" containerID="cri-o://fd263d596db29c9074c9bdeb64bbf7299d71e22e2b7ef560f862c8a5aa1f42ef" gracePeriod=30 Mar 19 12:03:59.019839 master-0 kubenswrapper[17644]: I0319 12:03:59.019716 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerName="kube-rbac-proxy" containerID="cri-o://0afce24a5d5f93336e577364d7c0df2f3a4ed2cf2501e8357b1b537f30d7ce5e" gracePeriod=30 Mar 19 12:03:59.956555 master-0 kubenswrapper[17644]: I0319 12:03:59.956432 17644 generic.go:334] "Generic (PLEG): container finished" podID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerID="0afce24a5d5f93336e577364d7c0df2f3a4ed2cf2501e8357b1b537f30d7ce5e" exitCode=0 Mar 19 12:03:59.956555 master-0 kubenswrapper[17644]: I0319 12:03:59.956500 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" event={"ID":"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7","Type":"ContainerDied","Data":"0afce24a5d5f93336e577364d7c0df2f3a4ed2cf2501e8357b1b537f30d7ce5e"} Mar 19 12:04:01.984702 master-0 kubenswrapper[17644]: I0319 12:04:01.984655 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerStarted","Data":"7f2b5ea288b54601e650c6ff62459c9ee6250acdb744f9a5c0b44b5469782a4a"} Mar 19 12:04:01.985060 master-0 kubenswrapper[17644]: I0319 12:04:01.984717 17644 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerStarted","Data":"3363d72a93f53effdf1582ca8bebe3f83cc8a44bae0001ada896373248911862"} Mar 19 12:04:01.985060 master-0 kubenswrapper[17644]: I0319 12:04:01.984741 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerStarted","Data":"880632dd345de6706363ebcb7ea267f4f31bac46efa67ff9fad79f362a13e5ae"} Mar 19 12:04:01.985060 master-0 kubenswrapper[17644]: I0319 12:04:01.984753 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerStarted","Data":"eb175aac5718f4058eadd585a688cb34e8c72deb365d141c863bb48284d27075"} Mar 19 12:04:01.985060 master-0 kubenswrapper[17644]: I0319 12:04:01.984762 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerStarted","Data":"2543d1abea93d0c704c4c0db59eb4a042ce21c8d13bf468370e1bf83e2ab8472"} Mar 19 12:04:01.989283 master-0 kubenswrapper[17644]: I0319 12:04:01.989254 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerStarted","Data":"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"} Mar 19 12:04:01.989283 master-0 kubenswrapper[17644]: I0319 12:04:01.989286 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerStarted","Data":"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"} Mar 19 12:04:01.989419 master-0 kubenswrapper[17644]: I0319 12:04:01.989296 17644 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerStarted","Data":"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"} Mar 19 12:04:01.989419 master-0 kubenswrapper[17644]: I0319 12:04:01.989304 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerStarted","Data":"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"} Mar 19 12:04:01.989419 master-0 kubenswrapper[17644]: I0319 12:04:01.989314 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerStarted","Data":"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"} Mar 19 12:04:02.971408 master-0 kubenswrapper[17644]: E0319 12:04:02.971335 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" podUID="d2fd7597-cd7a-4138-bb3c-01681c569bd3" Mar 19 12:04:03.010799 master-0 kubenswrapper[17644]: I0319 12:04:03.010051 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerStarted","Data":"a8631f6d1fc66da871d0154b3f1477651cfe5d30c25d45308d569545d8f58367"} Mar 19 12:04:03.017304 master-0 kubenswrapper[17644]: I0319 12:04:03.017253 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerStarted","Data":"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"} Mar 19 12:04:03.017304 master-0 kubenswrapper[17644]: I0319 12:04:03.017288 17644 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 12:04:03.043562 master-0 kubenswrapper[17644]: I0319 12:04:03.043477 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=20.225488475 podStartE2EDuration="24.043453596s" podCreationTimestamp="2026-03-19 12:03:39 +0000 UTC" firstStartedPulling="2026-03-19 12:03:55.909811015 +0000 UTC m=+269.679769050" lastFinishedPulling="2026-03-19 12:03:59.727776146 +0000 UTC m=+273.497734171" observedRunningTime="2026-03-19 12:04:03.040322259 +0000 UTC m=+276.810280314" watchObservedRunningTime="2026-03-19 12:04:03.043453596 +0000 UTC m=+276.813411641" Mar 19 12:04:03.085635 master-0 kubenswrapper[17644]: I0319 12:04:03.085548 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=11.033048306 podStartE2EDuration="19.085527144s" podCreationTimestamp="2026-03-19 12:03:44 +0000 UTC" firstStartedPulling="2026-03-19 12:03:52.99168116 +0000 UTC m=+266.761639195" lastFinishedPulling="2026-03-19 12:04:01.044159998 +0000 UTC m=+274.814118033" observedRunningTime="2026-03-19 12:04:03.083669339 +0000 UTC m=+276.853627394" watchObservedRunningTime="2026-03-19 12:04:03.085527144 +0000 UTC m=+276.855485199" Mar 19 12:04:03.305462 master-0 kubenswrapper[17644]: I0319 12:04:03.305329 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:04:03.305462 master-0 kubenswrapper[17644]: I0319 12:04:03.305420 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:04:06.442695 master-0 kubenswrapper[17644]: I0319 12:04:06.442609 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 12:04:06.444384 master-0 kubenswrapper[17644]: I0319 12:04:06.444327 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2fd7597-cd7a-4138-bb3c-01681c569bd3-trusted-ca\") pod \"console-operator-76b6568d85-8bvjj\" (UID: \"d2fd7597-cd7a-4138-bb3c-01681c569bd3\") " pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 12:04:06.619089 master-0 kubenswrapper[17644]: I0319 12:04:06.619004 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 12:04:07.108402 master-0 kubenswrapper[17644]: I0319 12:04:07.108331 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-8bvjj"] Mar 19 12:04:07.555040 master-0 kubenswrapper[17644]: I0319 12:04:07.554948 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:04:07.928359 master-0 kubenswrapper[17644]: E0319 12:04:07.928140 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:04:07.929237 master-0 kubenswrapper[17644]: E0319 12:04:07.929193 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:04:07.931155 master-0 kubenswrapper[17644]: E0319 12:04:07.931110 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:04:07.931229 master-0 kubenswrapper[17644]: E0319 12:04:07.931156 17644 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" podUID="69264375-c63b-4be0-80b9-52aefeca1382" containerName="kube-multus-additional-cni-plugins" Mar 19 12:04:08.071034 master-0 kubenswrapper[17644]: I0319 12:04:08.070934 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" event={"ID":"d2fd7597-cd7a-4138-bb3c-01681c569bd3","Type":"ContainerStarted","Data":"31b14dee0e87f1b3514c08236a600e9fa30043ecbcf36390f2a3a41d2ee2fe17"} Mar 19 12:04:11.092170 master-0 kubenswrapper[17644]: I0319 12:04:11.092104 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" event={"ID":"d2fd7597-cd7a-4138-bb3c-01681c569bd3","Type":"ContainerStarted","Data":"d3680a3e973473ebbe62e7541c6d05f099bc65de60dda2a4886bf757c3aa0b72"} Mar 19 12:04:11.092785 master-0 kubenswrapper[17644]: I0319 12:04:11.092438 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 12:04:11.118857 master-0 kubenswrapper[17644]: I0319 12:04:11.118648 17644 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" podStartSLOduration=251.379787084 podStartE2EDuration="4m15.118630613s" podCreationTimestamp="2026-03-19 11:59:56 +0000 UTC" firstStartedPulling="2026-03-19 12:04:07.112202343 +0000 UTC m=+280.882160378" lastFinishedPulling="2026-03-19 12:04:10.851045872 +0000 UTC m=+284.621003907" observedRunningTime="2026-03-19 12:04:11.115400973 +0000 UTC m=+284.885359018" watchObservedRunningTime="2026-03-19 12:04:11.118630613 +0000 UTC m=+284.888588648" Mar 19 12:04:11.406579 master-0 kubenswrapper[17644]: I0319 12:04:11.406457 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b6568d85-8bvjj" Mar 19 12:04:11.626674 master-0 kubenswrapper[17644]: I0319 12:04:11.626618 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-66b8ffb895-vqnnc"] Mar 19 12:04:11.627620 master-0 kubenswrapper[17644]: I0319 12:04:11.627598 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-66b8ffb895-vqnnc" Mar 19 12:04:11.632152 master-0 kubenswrapper[17644]: I0319 12:04:11.632115 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 12:04:11.632459 master-0 kubenswrapper[17644]: I0319 12:04:11.632396 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 12:04:11.643988 master-0 kubenswrapper[17644]: I0319 12:04:11.643908 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-kb5zl" Mar 19 12:04:11.644873 master-0 kubenswrapper[17644]: I0319 12:04:11.644830 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-vqnnc"] Mar 19 12:04:11.655778 master-0 kubenswrapper[17644]: I0319 12:04:11.655703 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb466\" (UniqueName: \"kubernetes.io/projected/e17d22fe-fe0f-448e-9666-882d888d3ad4-kube-api-access-lb466\") pod \"downloads-66b8ffb895-vqnnc\" (UID: \"e17d22fe-fe0f-448e-9666-882d888d3ad4\") " pod="openshift-console/downloads-66b8ffb895-vqnnc" Mar 19 12:04:11.757814 master-0 kubenswrapper[17644]: I0319 12:04:11.756646 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb466\" (UniqueName: \"kubernetes.io/projected/e17d22fe-fe0f-448e-9666-882d888d3ad4-kube-api-access-lb466\") pod \"downloads-66b8ffb895-vqnnc\" (UID: \"e17d22fe-fe0f-448e-9666-882d888d3ad4\") " pod="openshift-console/downloads-66b8ffb895-vqnnc" Mar 19 12:04:11.778828 master-0 kubenswrapper[17644]: I0319 12:04:11.776614 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb466\" (UniqueName: \"kubernetes.io/projected/e17d22fe-fe0f-448e-9666-882d888d3ad4-kube-api-access-lb466\") pod 
\"downloads-66b8ffb895-vqnnc\" (UID: \"e17d22fe-fe0f-448e-9666-882d888d3ad4\") " pod="openshift-console/downloads-66b8ffb895-vqnnc" Mar 19 12:04:11.950894 master-0 kubenswrapper[17644]: I0319 12:04:11.950837 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-66b8ffb895-vqnnc" Mar 19 12:04:12.438767 master-0 kubenswrapper[17644]: I0319 12:04:12.438670 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-vqnnc"] Mar 19 12:04:13.109999 master-0 kubenswrapper[17644]: I0319 12:04:13.109549 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-vqnnc" event={"ID":"e17d22fe-fe0f-448e-9666-882d888d3ad4","Type":"ContainerStarted","Data":"a1056ae4a6a35478d735595e347f7b0c473719efb8c32e275989e81a90027782"} Mar 19 12:04:14.958079 master-0 kubenswrapper[17644]: I0319 12:04:14.958022 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc"] Mar 19 12:04:14.964594 master-0 kubenswrapper[17644]: I0319 12:04:14.958998 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:14.972749 master-0 kubenswrapper[17644]: I0319 12:04:14.971818 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 12:04:14.972749 master-0 kubenswrapper[17644]: I0319 12:04:14.972063 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 12:04:15.002754 master-0 kubenswrapper[17644]: I0319 12:04:14.998925 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc"] Mar 19 12:04:15.118765 master-0 kubenswrapper[17644]: I0319 12:04:15.118696 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3508b92c-96e8-4454-b754-35dc5f1eee81-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-fwwjc\" (UID: \"3508b92c-96e8-4454-b754-35dc5f1eee81\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:15.118981 master-0 kubenswrapper[17644]: I0319 12:04:15.118859 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3508b92c-96e8-4454-b754-35dc5f1eee81-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-fwwjc\" (UID: \"3508b92c-96e8-4454-b754-35dc5f1eee81\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:15.220797 master-0 kubenswrapper[17644]: I0319 12:04:15.220624 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3508b92c-96e8-4454-b754-35dc5f1eee81-networking-console-plugin-cert\") pod 
\"networking-console-plugin-7c6b76c555-fwwjc\" (UID: \"3508b92c-96e8-4454-b754-35dc5f1eee81\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:15.220797 master-0 kubenswrapper[17644]: I0319 12:04:15.220702 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3508b92c-96e8-4454-b754-35dc5f1eee81-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-fwwjc\" (UID: \"3508b92c-96e8-4454-b754-35dc5f1eee81\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:15.221134 master-0 kubenswrapper[17644]: E0319 12:04:15.221047 17644 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 19 12:04:15.221242 master-0 kubenswrapper[17644]: E0319 12:04:15.221217 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3508b92c-96e8-4454-b754-35dc5f1eee81-networking-console-plugin-cert podName:3508b92c-96e8-4454-b754-35dc5f1eee81 nodeName:}" failed. No retries permitted until 2026-03-19 12:04:15.721181174 +0000 UTC m=+289.491139209 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3508b92c-96e8-4454-b754-35dc5f1eee81-networking-console-plugin-cert") pod "networking-console-plugin-7c6b76c555-fwwjc" (UID: "3508b92c-96e8-4454-b754-35dc5f1eee81") : secret "networking-console-plugin-cert" not found Mar 19 12:04:15.222044 master-0 kubenswrapper[17644]: I0319 12:04:15.222010 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3508b92c-96e8-4454-b754-35dc5f1eee81-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-fwwjc\" (UID: \"3508b92c-96e8-4454-b754-35dc5f1eee81\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:15.728824 master-0 kubenswrapper[17644]: I0319 12:04:15.728739 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3508b92c-96e8-4454-b754-35dc5f1eee81-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-fwwjc\" (UID: \"3508b92c-96e8-4454-b754-35dc5f1eee81\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:15.733430 master-0 kubenswrapper[17644]: I0319 12:04:15.733384 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3508b92c-96e8-4454-b754-35dc5f1eee81-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-fwwjc\" (UID: \"3508b92c-96e8-4454-b754-35dc5f1eee81\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:15.942842 master-0 kubenswrapper[17644]: I0319 12:04:15.942788 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" Mar 19 12:04:16.521255 master-0 kubenswrapper[17644]: I0319 12:04:16.521189 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc"] Mar 19 12:04:16.523670 master-0 kubenswrapper[17644]: W0319 12:04:16.523610 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3508b92c_96e8_4454_b754_35dc5f1eee81.slice/crio-82569dffe3bd6f22e42f1356dfc14426853cc6f18211c331520f3ef27935f4b1 WatchSource:0}: Error finding container 82569dffe3bd6f22e42f1356dfc14426853cc6f18211c331520f3ef27935f4b1: Status 404 returned error can't find the container with id 82569dffe3bd6f22e42f1356dfc14426853cc6f18211c331520f3ef27935f4b1 Mar 19 12:04:17.182092 master-0 kubenswrapper[17644]: I0319 12:04:17.182016 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" event={"ID":"3508b92c-96e8-4454-b754-35dc5f1eee81","Type":"ContainerStarted","Data":"82569dffe3bd6f22e42f1356dfc14426853cc6f18211c331520f3ef27935f4b1"} Mar 19 12:04:17.930543 master-0 kubenswrapper[17644]: E0319 12:04:17.930275 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:04:17.941796 master-0 kubenswrapper[17644]: E0319 12:04:17.934259 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:04:17.941796 master-0 kubenswrapper[17644]: E0319 12:04:17.936513 17644 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 12:04:17.941796 master-0 kubenswrapper[17644]: E0319 12:04:17.936581 17644 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" podUID="69264375-c63b-4be0-80b9-52aefeca1382" containerName="kube-multus-additional-cni-plugins" Mar 19 12:04:18.498123 master-0 kubenswrapper[17644]: I0319 12:04:18.498060 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:04:19.207437 master-0 kubenswrapper[17644]: I0319 12:04:19.207371 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" event={"ID":"3508b92c-96e8-4454-b754-35dc5f1eee81","Type":"ContainerStarted","Data":"db88429b048e84cfa24a4915ec0bd16c04bc43e98f6f76d75937f6fb05a22f7f"} Mar 19 12:04:19.269778 master-0 kubenswrapper[17644]: I0319 12:04:19.269613 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7c6b76c555-fwwjc" podStartSLOduration=3.1028142 podStartE2EDuration="5.269587109s" podCreationTimestamp="2026-03-19 12:04:14 +0000 UTC" firstStartedPulling="2026-03-19 12:04:16.528083672 +0000 UTC m=+290.298041707" lastFinishedPulling="2026-03-19 12:04:18.694856581 +0000 UTC m=+292.464814616" observedRunningTime="2026-03-19 12:04:19.266416821 
+0000 UTC m=+293.036374876" watchObservedRunningTime="2026-03-19 12:04:19.269587109 +0000 UTC m=+293.039545154" Mar 19 12:04:20.444761 master-0 kubenswrapper[17644]: I0319 12:04:20.440368 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-d9758b5c6-n2b98"] Mar 19 12:04:20.444761 master-0 kubenswrapper[17644]: I0319 12:04:20.442206 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.447261 master-0 kubenswrapper[17644]: I0319 12:04:20.446231 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 12:04:20.447261 master-0 kubenswrapper[17644]: I0319 12:04:20.446465 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 12:04:20.447261 master-0 kubenswrapper[17644]: I0319 12:04:20.446609 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 12:04:20.447261 master-0 kubenswrapper[17644]: I0319 12:04:20.446769 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-tnp7t" Mar 19 12:04:20.447261 master-0 kubenswrapper[17644]: I0319 12:04:20.447199 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 12:04:20.448599 master-0 kubenswrapper[17644]: I0319 12:04:20.448146 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-service-ca\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.448599 master-0 kubenswrapper[17644]: I0319 12:04:20.448334 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-oauth-serving-cert\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.448599 master-0 kubenswrapper[17644]: I0319 12:04:20.448392 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-console-config\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.448599 master-0 kubenswrapper[17644]: I0319 12:04:20.448490 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-oauth-config\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.448599 master-0 kubenswrapper[17644]: I0319 12:04:20.448553 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmlbh\" (UniqueName: \"kubernetes.io/projected/f636deab-3372-4c36-b492-df2442da1e31-kube-api-access-cmlbh\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.448599 master-0 kubenswrapper[17644]: I0319 12:04:20.448601 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 12:04:20.449097 master-0 kubenswrapper[17644]: I0319 12:04:20.448706 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-serving-cert\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.482199 master-0 kubenswrapper[17644]: I0319 12:04:20.482128 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d9758b5c6-n2b98"] Mar 19 12:04:20.549374 master-0 kubenswrapper[17644]: I0319 12:04:20.549201 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-oauth-config\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.549374 master-0 kubenswrapper[17644]: I0319 12:04:20.549248 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmlbh\" (UniqueName: \"kubernetes.io/projected/f636deab-3372-4c36-b492-df2442da1e31-kube-api-access-cmlbh\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.549778 master-0 kubenswrapper[17644]: I0319 12:04:20.549563 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-serving-cert\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.551415 master-0 kubenswrapper[17644]: I0319 12:04:20.550389 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-service-ca\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " 
pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.551415 master-0 kubenswrapper[17644]: I0319 12:04:20.550683 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-oauth-serving-cert\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.551415 master-0 kubenswrapper[17644]: I0319 12:04:20.550773 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-console-config\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.551548 master-0 kubenswrapper[17644]: I0319 12:04:20.551506 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-service-ca\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.553834 master-0 kubenswrapper[17644]: I0319 12:04:20.551950 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-console-config\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.553834 master-0 kubenswrapper[17644]: I0319 12:04:20.552094 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-oauth-serving-cert\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " 
pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.554598 master-0 kubenswrapper[17644]: I0319 12:04:20.554541 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-serving-cert\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.562905 master-0 kubenswrapper[17644]: I0319 12:04:20.562827 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-oauth-config\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.567857 master-0 kubenswrapper[17644]: I0319 12:04:20.567811 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmlbh\" (UniqueName: \"kubernetes.io/projected/f636deab-3372-4c36-b492-df2442da1e31-kube-api-access-cmlbh\") pod \"console-d9758b5c6-n2b98\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") " pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:20.779888 master-0 kubenswrapper[17644]: I0319 12:04:20.779032 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:04:21.418289 master-0 kubenswrapper[17644]: I0319 12:04:21.418233 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d9758b5c6-n2b98"] Mar 19 12:04:21.808057 master-0 kubenswrapper[17644]: I0319 12:04:21.807984 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 12:04:21.810605 master-0 kubenswrapper[17644]: I0319 12:04:21.809853 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:21.813368 master-0 kubenswrapper[17644]: I0319 12:04:21.813348 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 19 12:04:21.841270 master-0 kubenswrapper[17644]: I0319 12:04:21.841200 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 12:04:21.922088 master-0 kubenswrapper[17644]: I0319 12:04:21.922009 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kube-api-access\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:21.922293 master-0 kubenswrapper[17644]: I0319 12:04:21.922235 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-var-lock\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:21.922360 master-0 kubenswrapper[17644]: I0319 12:04:21.922348 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:21.995398 master-0 kubenswrapper[17644]: I0319 12:04:21.995326 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-b2dmc_69264375-c63b-4be0-80b9-52aefeca1382/kube-multus-additional-cni-plugins/0.log" Mar 19 12:04:21.995596 master-0 kubenswrapper[17644]: I0319 12:04:21.995457 17644 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:04:22.023664 master-0 kubenswrapper[17644]: I0319 12:04:22.023598 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-var-lock\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:22.023909 master-0 kubenswrapper[17644]: I0319 12:04:22.023700 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:22.023909 master-0 kubenswrapper[17644]: I0319 12:04:22.023760 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kube-api-access\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:22.024308 master-0 kubenswrapper[17644]: I0319 12:04:22.024285 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-var-lock\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:22.024363 master-0 kubenswrapper[17644]: I0319 12:04:22.024336 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " 
pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:22.040760 master-0 kubenswrapper[17644]: I0319 12:04:22.040689 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kube-api-access\") pod \"installer-2-master-0\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:22.126038 master-0 kubenswrapper[17644]: I0319 12:04:22.125437 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b59np\" (UniqueName: \"kubernetes.io/projected/69264375-c63b-4be0-80b9-52aefeca1382-kube-api-access-b59np\") pod \"69264375-c63b-4be0-80b9-52aefeca1382\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " Mar 19 12:04:22.126038 master-0 kubenswrapper[17644]: I0319 12:04:22.125511 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/69264375-c63b-4be0-80b9-52aefeca1382-cni-sysctl-allowlist\") pod \"69264375-c63b-4be0-80b9-52aefeca1382\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " Mar 19 12:04:22.126038 master-0 kubenswrapper[17644]: I0319 12:04:22.125568 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/69264375-c63b-4be0-80b9-52aefeca1382-ready\") pod \"69264375-c63b-4be0-80b9-52aefeca1382\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " Mar 19 12:04:22.126038 master-0 kubenswrapper[17644]: I0319 12:04:22.125609 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69264375-c63b-4be0-80b9-52aefeca1382-tuning-conf-dir\") pod \"69264375-c63b-4be0-80b9-52aefeca1382\" (UID: \"69264375-c63b-4be0-80b9-52aefeca1382\") " Mar 19 12:04:22.126038 master-0 kubenswrapper[17644]: I0319 12:04:22.125804 17644 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69264375-c63b-4be0-80b9-52aefeca1382-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "69264375-c63b-4be0-80b9-52aefeca1382" (UID: "69264375-c63b-4be0-80b9-52aefeca1382"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:04:22.126038 master-0 kubenswrapper[17644]: I0319 12:04:22.125956 17644 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69264375-c63b-4be0-80b9-52aefeca1382-tuning-conf-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:22.126431 master-0 kubenswrapper[17644]: I0319 12:04:22.126103 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69264375-c63b-4be0-80b9-52aefeca1382-ready" (OuterVolumeSpecName: "ready") pod "69264375-c63b-4be0-80b9-52aefeca1382" (UID: "69264375-c63b-4be0-80b9-52aefeca1382"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:04:22.126431 master-0 kubenswrapper[17644]: I0319 12:04:22.126200 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69264375-c63b-4be0-80b9-52aefeca1382-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "69264375-c63b-4be0-80b9-52aefeca1382" (UID: "69264375-c63b-4be0-80b9-52aefeca1382"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:04:22.136582 master-0 kubenswrapper[17644]: I0319 12:04:22.136515 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69264375-c63b-4be0-80b9-52aefeca1382-kube-api-access-b59np" (OuterVolumeSpecName: "kube-api-access-b59np") pod "69264375-c63b-4be0-80b9-52aefeca1382" (UID: "69264375-c63b-4be0-80b9-52aefeca1382"). InnerVolumeSpecName "kube-api-access-b59np". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:04:22.137866 master-0 kubenswrapper[17644]: I0319 12:04:22.137688 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 12:04:22.227453 master-0 kubenswrapper[17644]: I0319 12:04:22.227384 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b59np\" (UniqueName: \"kubernetes.io/projected/69264375-c63b-4be0-80b9-52aefeca1382-kube-api-access-b59np\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:22.227453 master-0 kubenswrapper[17644]: I0319 12:04:22.227450 17644 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/69264375-c63b-4be0-80b9-52aefeca1382-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:22.227453 master-0 kubenswrapper[17644]: I0319 12:04:22.227465 17644 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/69264375-c63b-4be0-80b9-52aefeca1382-ready\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:22.232926 master-0 kubenswrapper[17644]: I0319 12:04:22.232870 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-b2dmc_69264375-c63b-4be0-80b9-52aefeca1382/kube-multus-additional-cni-plugins/0.log" Mar 19 12:04:22.233056 master-0 kubenswrapper[17644]: I0319 12:04:22.232935 17644 generic.go:334] "Generic (PLEG): container finished" podID="69264375-c63b-4be0-80b9-52aefeca1382" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" exitCode=137 Mar 19 12:04:22.233056 master-0 kubenswrapper[17644]: I0319 12:04:22.233008 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" event={"ID":"69264375-c63b-4be0-80b9-52aefeca1382","Type":"ContainerDied","Data":"02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20"} Mar 19 
12:04:22.233056 master-0 kubenswrapper[17644]: I0319 12:04:22.233033 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" Mar 19 12:04:22.233231 master-0 kubenswrapper[17644]: I0319 12:04:22.233041 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-b2dmc" event={"ID":"69264375-c63b-4be0-80b9-52aefeca1382","Type":"ContainerDied","Data":"9f9d47e4dd383fdf4e789661c83b79b4e65e3937e77d461da0de6315684f5f8e"} Mar 19 12:04:22.233277 master-0 kubenswrapper[17644]: I0319 12:04:22.233057 17644 scope.go:117] "RemoveContainer" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" Mar 19 12:04:22.236328 master-0 kubenswrapper[17644]: I0319 12:04:22.236237 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9758b5c6-n2b98" event={"ID":"f636deab-3372-4c36-b492-df2442da1e31","Type":"ContainerStarted","Data":"8659f435a458173cfde862d58e5a1c333c45e91207e97205273a5b46661fbad0"} Mar 19 12:04:22.262075 master-0 kubenswrapper[17644]: I0319 12:04:22.262043 17644 scope.go:117] "RemoveContainer" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" Mar 19 12:04:22.263166 master-0 kubenswrapper[17644]: E0319 12:04:22.263115 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20\": container with ID starting with 02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20 not found: ID does not exist" containerID="02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20" Mar 19 12:04:22.263251 master-0 kubenswrapper[17644]: I0319 12:04:22.263169 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20"} err="failed to get 
container status \"02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20\": rpc error: code = NotFound desc = could not find container \"02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20\": container with ID starting with 02a9c97b78bcee0c2ed5ad248cab444495662a7f271c004b29656d38d9609c20 not found: ID does not exist" Mar 19 12:04:22.275167 master-0 kubenswrapper[17644]: I0319 12:04:22.274163 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-b2dmc"] Mar 19 12:04:22.281871 master-0 kubenswrapper[17644]: I0319 12:04:22.281808 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-b2dmc"] Mar 19 12:04:22.493890 master-0 kubenswrapper[17644]: I0319 12:04:22.493674 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69264375-c63b-4be0-80b9-52aefeca1382" path="/var/lib/kubelet/pods/69264375-c63b-4be0-80b9-52aefeca1382/volumes" Mar 19 12:04:22.666595 master-0 kubenswrapper[17644]: W0319 12:04:22.666544 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4ec000c4_5cc8_45b3_95ba_2856655f02f5.slice/crio-5c721c029071b39ac77069956f4b8a9993362193cc4ff7085dec746b764d225e WatchSource:0}: Error finding container 5c721c029071b39ac77069956f4b8a9993362193cc4ff7085dec746b764d225e: Status 404 returned error can't find the container with id 5c721c029071b39ac77069956f4b8a9993362193cc4ff7085dec746b764d225e Mar 19 12:04:22.667757 master-0 kubenswrapper[17644]: I0319 12:04:22.667690 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 12:04:23.257622 master-0 kubenswrapper[17644]: I0319 12:04:23.255818 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"4ec000c4-5cc8-45b3-95ba-2856655f02f5","Type":"ContainerStarted","Data":"2954bb26fffbeed69600aa01dfdf7f588c1039c56fa3503890cba6c8da239e98"} 
Mar 19 12:04:23.257622 master-0 kubenswrapper[17644]: I0319 12:04:23.255866 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"4ec000c4-5cc8-45b3-95ba-2856655f02f5","Type":"ContainerStarted","Data":"5c721c029071b39ac77069956f4b8a9993362193cc4ff7085dec746b764d225e"} Mar 19 12:04:23.290418 master-0 kubenswrapper[17644]: I0319 12:04:23.290328 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.290302971 podStartE2EDuration="2.290302971s" podCreationTimestamp="2026-03-19 12:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:04:23.281193776 +0000 UTC m=+297.051151831" watchObservedRunningTime="2026-03-19 12:04:23.290302971 +0000 UTC m=+297.060261006" Mar 19 12:04:23.309831 master-0 kubenswrapper[17644]: I0319 12:04:23.309774 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:04:23.320660 master-0 kubenswrapper[17644]: I0319 12:04:23.320613 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7f479f8754-7s22b" Mar 19 12:04:26.673983 master-0 kubenswrapper[17644]: I0319 12:04:26.673915 17644 kubelet.go:1505] "Image garbage collection succeeded" Mar 19 12:04:27.344108 master-0 kubenswrapper[17644]: I0319 12:04:27.344004 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9758b5c6-n2b98" event={"ID":"f636deab-3372-4c36-b492-df2442da1e31","Type":"ContainerStarted","Data":"240d62d6290746118c3fa599222f62e950c62cff164d71a917712346ad5fd3da"} Mar 19 12:04:27.387753 master-0 kubenswrapper[17644]: I0319 12:04:27.377958 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d9758b5c6-n2b98" 
podStartSLOduration=2.069922477 podStartE2EDuration="7.377941974s" podCreationTimestamp="2026-03-19 12:04:20 +0000 UTC" firstStartedPulling="2026-03-19 12:04:21.408141762 +0000 UTC m=+295.178099797" lastFinishedPulling="2026-03-19 12:04:26.716161259 +0000 UTC m=+300.486119294" observedRunningTime="2026-03-19 12:04:27.370139651 +0000 UTC m=+301.140097696" watchObservedRunningTime="2026-03-19 12:04:27.377941974 +0000 UTC m=+301.147899999" Mar 19 12:04:27.572199 master-0 kubenswrapper[17644]: I0319 12:04:27.572123 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Mar 19 12:04:27.572479 master-0 kubenswrapper[17644]: E0319 12:04:27.572456 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69264375-c63b-4be0-80b9-52aefeca1382" containerName="kube-multus-additional-cni-plugins" Mar 19 12:04:27.572479 master-0 kubenswrapper[17644]: I0319 12:04:27.572471 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="69264375-c63b-4be0-80b9-52aefeca1382" containerName="kube-multus-additional-cni-plugins" Mar 19 12:04:27.572669 master-0 kubenswrapper[17644]: I0319 12:04:27.572649 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="69264375-c63b-4be0-80b9-52aefeca1382" containerName="kube-multus-additional-cni-plugins" Mar 19 12:04:27.573167 master-0 kubenswrapper[17644]: I0319 12:04:27.573142 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.576250 master-0 kubenswrapper[17644]: I0319 12:04:27.576217 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-kpv7f" Mar 19 12:04:27.581246 master-0 kubenswrapper[17644]: I0319 12:04:27.576586 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 12:04:27.586744 master-0 kubenswrapper[17644]: I0319 12:04:27.586494 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Mar 19 12:04:27.621825 master-0 kubenswrapper[17644]: I0319 12:04:27.621716 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dff9f91a-2293-4b2d-95dd-be0f9152984e-kube-api-access\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.621825 master-0 kubenswrapper[17644]: I0319 12:04:27.621795 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-var-lock\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.621825 master-0 kubenswrapper[17644]: I0319 12:04:27.621815 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.723017 master-0 kubenswrapper[17644]: I0319 12:04:27.722962 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dff9f91a-2293-4b2d-95dd-be0f9152984e-kube-api-access\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.723017 master-0 kubenswrapper[17644]: I0319 12:04:27.723013 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-var-lock\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.723017 master-0 kubenswrapper[17644]: I0319 12:04:27.723032 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.723786 master-0 kubenswrapper[17644]: I0319 12:04:27.723093 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.723786 master-0 kubenswrapper[17644]: I0319 12:04:27.723357 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-var-lock\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.741909 master-0 kubenswrapper[17644]: I0319 12:04:27.741861 17644 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dff9f91a-2293-4b2d-95dd-be0f9152984e-kube-api-access\") pod \"installer-6-master-0\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.867746 master-0 kubenswrapper[17644]: I0319 12:04:27.863916 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7db659c55f-mfdrv"] Mar 19 12:04:27.867746 master-0 kubenswrapper[17644]: I0319 12:04:27.864925 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:27.882231 master-0 kubenswrapper[17644]: I0319 12:04:27.880943 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7db659c55f-mfdrv"] Mar 19 12:04:27.888472 master-0 kubenswrapper[17644]: I0319 12:04:27.888255 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 12:04:27.898900 master-0 kubenswrapper[17644]: I0319 12:04:27.898824 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:04:27.927336 master-0 kubenswrapper[17644]: I0319 12:04:27.927146 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-oauth-serving-cert\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:27.927537 master-0 kubenswrapper[17644]: I0319 12:04:27.927334 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-serving-cert\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:27.927575 master-0 kubenswrapper[17644]: I0319 12:04:27.927530 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66v85\" (UniqueName: \"kubernetes.io/projected/5696b619-f43e-47a7-b557-5a6abc07cd2a-kube-api-access-66v85\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:27.927609 master-0 kubenswrapper[17644]: I0319 12:04:27.927581 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-oauth-config\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:27.927746 master-0 kubenswrapper[17644]: I0319 12:04:27.927693 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-trusted-ca-bundle\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:27.927893 master-0 kubenswrapper[17644]: I0319 12:04:27.927858 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-config\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:27.927973 master-0 kubenswrapper[17644]: I0319 12:04:27.927900 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-service-ca\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.029651 master-0 kubenswrapper[17644]: I0319 12:04:28.029540 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-serving-cert\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.029871 master-0 kubenswrapper[17644]: I0319 12:04:28.029669 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66v85\" (UniqueName: \"kubernetes.io/projected/5696b619-f43e-47a7-b557-5a6abc07cd2a-kube-api-access-66v85\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.029871 master-0 kubenswrapper[17644]: I0319 12:04:28.029708 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-oauth-config\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.029871 master-0 kubenswrapper[17644]: I0319 12:04:28.029774 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-trusted-ca-bundle\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.029871 master-0 kubenswrapper[17644]: I0319 12:04:28.029843 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-config\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.030026 master-0 kubenswrapper[17644]: I0319 12:04:28.029890 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-service-ca\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.030026 master-0 kubenswrapper[17644]: I0319 12:04:28.029965 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-oauth-serving-cert\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.031478 master-0 
kubenswrapper[17644]: I0319 12:04:28.031434 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-oauth-serving-cert\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.033411 master-0 kubenswrapper[17644]: I0319 12:04:28.033158 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-service-ca\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.034388 master-0 kubenswrapper[17644]: I0319 12:04:28.034338 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-config\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.038104 master-0 kubenswrapper[17644]: I0319 12:04:28.038037 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-oauth-config\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.039235 master-0 kubenswrapper[17644]: I0319 12:04:28.039181 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-serving-cert\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.040537 master-0 
kubenswrapper[17644]: I0319 12:04:28.040480 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-trusted-ca-bundle\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.052712 master-0 kubenswrapper[17644]: I0319 12:04:28.052668 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66v85\" (UniqueName: \"kubernetes.io/projected/5696b619-f43e-47a7-b557-5a6abc07cd2a-kube-api-access-66v85\") pod \"console-7db659c55f-mfdrv\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.202966 master-0 kubenswrapper[17644]: I0319 12:04:28.202901 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:04:28.470712 master-0 kubenswrapper[17644]: I0319 12:04:28.470670 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Mar 19 12:04:28.593105 master-0 kubenswrapper[17644]: I0319 12:04:28.592069 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 12:04:28.593105 master-0 kubenswrapper[17644]: I0319 12:04:28.592316 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" containerID="cri-o://6ecb192a1cfeb4529102ad33aeed1229502ac0d4a0688a01c8e90bffa6cdc39c" gracePeriod=30 Mar 19 12:04:28.593105 master-0 kubenswrapper[17644]: I0319 12:04:28.592473 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://fa33151970d752ef2161babaa56491652362bb6f1d5e173d5390c7f59b36f27d" gracePeriod=30 Mar 19 12:04:28.595389 master-0 kubenswrapper[17644]: I0319 12:04:28.595192 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:04:28.596237 master-0 kubenswrapper[17644]: E0319 12:04:28.595856 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.596237 master-0 kubenswrapper[17644]: I0319 12:04:28.595886 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.596237 master-0 kubenswrapper[17644]: E0319 12:04:28.596001 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.596237 master-0 kubenswrapper[17644]: I0319 12:04:28.596013 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.596237 master-0 kubenswrapper[17644]: E0319 12:04:28.596037 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 12:04:28.596237 master-0 kubenswrapper[17644]: I0319 12:04:28.596047 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 12:04:28.596237 master-0 kubenswrapper[17644]: E0319 12:04:28.596070 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.596237 master-0 kubenswrapper[17644]: I0319 12:04:28.596080 17644 
state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.597339 master-0 kubenswrapper[17644]: I0319 12:04:28.596673 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.597339 master-0 kubenswrapper[17644]: I0319 12:04:28.596739 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.597339 master-0 kubenswrapper[17644]: I0319 12:04:28.596759 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 12:04:28.597339 master-0 kubenswrapper[17644]: I0319 12:04:28.596780 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 12:04:28.599035 master-0 kubenswrapper[17644]: I0319 12:04:28.598961 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:04:28.640838 master-0 kubenswrapper[17644]: I0319 12:04:28.640604 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"607a35c2a34325129014a178207e606c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:04:28.640838 master-0 kubenswrapper[17644]: I0319 12:04:28.640687 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"607a35c2a34325129014a178207e606c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:04:28.719054 master-0 kubenswrapper[17644]: I0319 12:04:28.719007 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:04:28.742638 master-0 kubenswrapper[17644]: I0319 12:04:28.742595 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"607a35c2a34325129014a178207e606c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:04:28.743372 master-0 kubenswrapper[17644]: I0319 12:04:28.742961 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"607a35c2a34325129014a178207e606c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
Mar 19 12:04:28.743519 master-0 kubenswrapper[17644]: I0319 12:04:28.743479 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"607a35c2a34325129014a178207e606c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:04:28.745002 master-0 kubenswrapper[17644]: I0319 12:04:28.744865 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"607a35c2a34325129014a178207e606c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:04:28.749635 master-0 kubenswrapper[17644]: I0319 12:04:28.749600 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 12:04:28.762643 master-0 kubenswrapper[17644]: I0319 12:04:28.762553 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7db659c55f-mfdrv"] Mar 19 12:04:28.780787 master-0 kubenswrapper[17644]: I0319 12:04:28.780452 17644 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="6125acf0-c600-44c7-a816-38c41fcfebee" Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.845416 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.845513 17644 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.845626 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.845707 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.845772 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.845803 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.846270 17644 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") on node 
\"master-0\" DevicePath \"\"" Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.846306 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs" (OuterVolumeSpecName: "logs") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.846329 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.846350 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config" (OuterVolumeSpecName: "config") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:04:28.846825 master-0 kubenswrapper[17644]: I0319 12:04:28.846367 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets" (OuterVolumeSpecName: "secrets") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:04:28.949363 master-0 kubenswrapper[17644]: I0319 12:04:28.948648 17644 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:28.949363 master-0 kubenswrapper[17644]: I0319 12:04:28.948686 17644 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:28.949363 master-0 kubenswrapper[17644]: I0319 12:04:28.948702 17644 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:28.949363 master-0 kubenswrapper[17644]: I0319 12:04:28.948714 17644 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:28.998116 master-0 kubenswrapper[17644]: I0319 12:04:28.997864 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:04:29.020652 master-0 kubenswrapper[17644]: W0319 12:04:29.020589 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607a35c2a34325129014a178207e606c.slice/crio-b5e7e400236405f2886d239c6b9855e16c927caf360652a9a3fc0202e1c9146b WatchSource:0}: Error finding container b5e7e400236405f2886d239c6b9855e16c927caf360652a9a3fc0202e1c9146b: Status 404 returned error can't find the container with id b5e7e400236405f2886d239c6b9855e16c927caf360652a9a3fc0202e1c9146b Mar 19 12:04:29.371450 master-0 kubenswrapper[17644]: I0319 12:04:29.371401 17644 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="fa33151970d752ef2161babaa56491652362bb6f1d5e173d5390c7f59b36f27d" exitCode=0 Mar 19 12:04:29.371450 master-0 kubenswrapper[17644]: I0319 12:04:29.371437 17644 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="6ecb192a1cfeb4529102ad33aeed1229502ac0d4a0688a01c8e90bffa6cdc39c" exitCode=0 Mar 19 12:04:29.371584 master-0 kubenswrapper[17644]: I0319 12:04:29.371571 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 12:04:29.371631 master-0 kubenswrapper[17644]: I0319 12:04:29.371558 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c21d5cdcf33dc5445d398db5efae2e61668498b313fd2a8200f2011b9857d1d4" Mar 19 12:04:29.371674 master-0 kubenswrapper[17644]: I0319 12:04:29.371647 17644 scope.go:117] "RemoveContainer" containerID="b533d36029413fb01ef12a682704ad486041204246c172de95f0a4aeff2f5180" Mar 19 12:04:29.373754 master-0 kubenswrapper[17644]: I0319 12:04:29.373129 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"dff9f91a-2293-4b2d-95dd-be0f9152984e","Type":"ContainerStarted","Data":"b04dc99fc70fd502313237dd24a3f425eae2db56e29cdb03b7711c7be66b620c"} Mar 19 12:04:29.373808 master-0 kubenswrapper[17644]: I0319 12:04:29.373766 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"dff9f91a-2293-4b2d-95dd-be0f9152984e","Type":"ContainerStarted","Data":"9a9cc855dd35c321bb6028dac43d394ebc987b0227f410ae6aae3528faae2384"} Mar 19 12:04:29.377466 master-0 kubenswrapper[17644]: I0319 12:04:29.377433 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-wdwkz_89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/multus-admission-controller/0.log" Mar 19 12:04:29.377556 master-0 kubenswrapper[17644]: I0319 12:04:29.377478 17644 generic.go:334] "Generic (PLEG): container finished" podID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerID="fd263d596db29c9074c9bdeb64bbf7299d71e22e2b7ef560f862c8a5aa1f42ef" exitCode=137 Mar 19 12:04:29.377556 master-0 kubenswrapper[17644]: I0319 12:04:29.377534 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" 
event={"ID":"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7","Type":"ContainerDied","Data":"fd263d596db29c9074c9bdeb64bbf7299d71e22e2b7ef560f862c8a5aa1f42ef"} Mar 19 12:04:29.381967 master-0 kubenswrapper[17644]: I0319 12:04:29.381929 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db659c55f-mfdrv" event={"ID":"5696b619-f43e-47a7-b557-5a6abc07cd2a","Type":"ContainerStarted","Data":"3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4"} Mar 19 12:04:29.381967 master-0 kubenswrapper[17644]: I0319 12:04:29.381959 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db659c55f-mfdrv" event={"ID":"5696b619-f43e-47a7-b557-5a6abc07cd2a","Type":"ContainerStarted","Data":"eb1ac23240d2fb118e9a10c9a5060cc9fb1f8178658db8704746ed8463a63f5f"} Mar 19 12:04:29.388991 master-0 kubenswrapper[17644]: I0319 12:04:29.388942 17644 generic.go:334] "Generic (PLEG): container finished" podID="6602edde-61c4-4316-a2ca-a21c764eb590" containerID="47bdda6e9df1906851b44214a81715bce9596f58c9f5ceb95fe1c2e7f3bea6e0" exitCode=0 Mar 19 12:04:29.389163 master-0 kubenswrapper[17644]: I0319 12:04:29.389019 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"6602edde-61c4-4316-a2ca-a21c764eb590","Type":"ContainerDied","Data":"47bdda6e9df1906851b44214a81715bce9596f58c9f5ceb95fe1c2e7f3bea6e0"} Mar 19 12:04:29.390980 master-0 kubenswrapper[17644]: I0319 12:04:29.390925 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerStarted","Data":"ea2a9b69ce6100be8b7a5ab8a5b3754a3d903dde1f34f7b2ab132e5a43190e43"} Mar 19 12:04:29.398763 master-0 kubenswrapper[17644]: I0319 12:04:29.398628 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerStarted","Data":"b5e7e400236405f2886d239c6b9855e16c927caf360652a9a3fc0202e1c9146b"} Mar 19 12:04:29.399913 master-0 kubenswrapper[17644]: I0319 12:04:29.399839 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-6-master-0" podStartSLOduration=2.399819559 podStartE2EDuration="2.399819559s" podCreationTimestamp="2026-03-19 12:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:04:29.393338829 +0000 UTC m=+303.163296874" watchObservedRunningTime="2026-03-19 12:04:29.399819559 +0000 UTC m=+303.169777614" Mar 19 12:04:29.419860 master-0 kubenswrapper[17644]: I0319 12:04:29.419813 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-wdwkz_89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/multus-admission-controller/0.log" Mar 19 12:04:29.420027 master-0 kubenswrapper[17644]: I0319 12:04:29.419888 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 12:04:29.429773 master-0 kubenswrapper[17644]: I0319 12:04:29.429682 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7db659c55f-mfdrv" podStartSLOduration=2.429664555 podStartE2EDuration="2.429664555s" podCreationTimestamp="2026-03-19 12:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:04:29.422110209 +0000 UTC m=+303.192068274" watchObservedRunningTime="2026-03-19 12:04:29.429664555 +0000 UTC m=+303.199622590" Mar 19 12:04:29.556508 master-0 kubenswrapper[17644]: I0319 12:04:29.556441 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") pod \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " Mar 19 12:04:29.556692 master-0 kubenswrapper[17644]: I0319 12:04:29.556525 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") pod \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\" (UID: \"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7\") " Mar 19 12:04:29.561288 master-0 kubenswrapper[17644]: I0319 12:04:29.561255 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:04:29.561974 master-0 kubenswrapper[17644]: I0319 12:04:29.561950 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn" (OuterVolumeSpecName: "kube-api-access-7spvn") pod "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" (UID: "89cf2ee8-3664-4502-b70c-b7e0a5e92cb7"). InnerVolumeSpecName "kube-api-access-7spvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:04:29.659233 master-0 kubenswrapper[17644]: I0319 12:04:29.658890 17644 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-webhook-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:29.659233 master-0 kubenswrapper[17644]: I0319 12:04:29.658943 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7spvn\" (UniqueName: \"kubernetes.io/projected/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7-kube-api-access-7spvn\") on node \"master-0\" DevicePath \"\"" Mar 19 12:04:30.400621 master-0 kubenswrapper[17644]: I0319 12:04:30.400470 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-wdwkz_89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/multus-admission-controller/0.log" Mar 19 12:04:30.400621 master-0 kubenswrapper[17644]: I0319 12:04:30.400581 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" event={"ID":"89cf2ee8-3664-4502-b70c-b7e0a5e92cb7","Type":"ContainerDied","Data":"975e632bf87b61a6785fc741d9417b8abbd6243ba2abd8088f9fe581fcfef90c"} Mar 19 12:04:30.401204 master-0 kubenswrapper[17644]: I0319 12:04:30.400625 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz" Mar 19 12:04:30.401204 master-0 kubenswrapper[17644]: I0319 12:04:30.400641 17644 scope.go:117] "RemoveContainer" containerID="0afce24a5d5f93336e577364d7c0df2f3a4ed2cf2501e8357b1b537f30d7ce5e" Mar 19 12:04:30.404145 master-0 kubenswrapper[17644]: I0319 12:04:30.404079 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerStarted","Data":"6df3457295116a2e9643f9aa93c1bc33230ddf9f1366aab4d64dcdaedbded1b4"} Mar 19 12:04:30.404228 master-0 kubenswrapper[17644]: I0319 12:04:30.404145 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerStarted","Data":"1700c51a0be3b8389e42a5cf379351f4fa21a1a23cc74be2e934a716c3897cd0"} Mar 19 12:04:30.404228 master-0 kubenswrapper[17644]: I0319 12:04:30.404164 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerStarted","Data":"f7c540b0641ec46c201cc061924c6ea67fd66d520e406a823de347faf358f648"} Mar 19 12:04:30.424230 master-0 kubenswrapper[17644]: I0319 12:04:30.424197 17644 scope.go:117] "RemoveContainer" containerID="fd263d596db29c9074c9bdeb64bbf7299d71e22e2b7ef560f862c8a5aa1f42ef" Mar 19 12:04:30.442362 master-0 kubenswrapper[17644]: I0319 12:04:30.442126 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.4421068 podStartE2EDuration="2.4421068s" podCreationTimestamp="2026-03-19 12:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 12:04:30.429833047 +0000 UTC m=+304.199791102" watchObservedRunningTime="2026-03-19 12:04:30.4421068 +0000 UTC m=+304.212064845"
Mar 19 12:04:30.449964 master-0 kubenswrapper[17644]: I0319 12:04:30.449910 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"]
Mar 19 12:04:30.457979 master-0 kubenswrapper[17644]: I0319 12:04:30.457889 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-wdwkz"]
Mar 19 12:04:30.498604 master-0 kubenswrapper[17644]: I0319 12:04:30.498461 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f265536aba6292ead501bc9b49f327" path="/var/lib/kubelet/pods/46f265536aba6292ead501bc9b49f327/volumes"
Mar 19 12:04:30.499123 master-0 kubenswrapper[17644]: I0319 12:04:30.499100 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" path="/var/lib/kubelet/pods/89cf2ee8-3664-4502-b70c-b7e0a5e92cb7/volumes"
Mar 19 12:04:30.499438 master-0 kubenswrapper[17644]: I0319 12:04:30.499412 17644 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID=""
Mar 19 12:04:30.528896 master-0 kubenswrapper[17644]: I0319 12:04:30.528838 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 19 12:04:30.528896 master-0 kubenswrapper[17644]: I0319 12:04:30.528884 17644 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="6125acf0-c600-44c7-a816-38c41fcfebee"
Mar 19 12:04:30.549820 master-0 kubenswrapper[17644]: I0319 12:04:30.542576 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 19 12:04:30.549820 master-0 kubenswrapper[17644]: I0319 12:04:30.542625 17644 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="6125acf0-c600-44c7-a816-38c41fcfebee"
Mar 19 12:04:30.779304 master-0 kubenswrapper[17644]: I0319 12:04:30.779227 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-d9758b5c6-n2b98"
Mar 19 12:04:30.779304 master-0 kubenswrapper[17644]: I0319 12:04:30.779277 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d9758b5c6-n2b98"
Mar 19 12:04:30.782019 master-0 kubenswrapper[17644]: I0319 12:04:30.781959 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 12:04:30.784224 master-0 kubenswrapper[17644]: I0319 12:04:30.784181 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d9758b5c6-n2b98"
Mar 19 12:04:30.885878 master-0 kubenswrapper[17644]: I0319 12:04:30.885811 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6602edde-61c4-4316-a2ca-a21c764eb590-kube-api-access\") pod \"6602edde-61c4-4316-a2ca-a21c764eb590\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") "
Mar 19 12:04:30.886082 master-0 kubenswrapper[17644]: I0319 12:04:30.885992 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-var-lock\") pod \"6602edde-61c4-4316-a2ca-a21c764eb590\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") "
Mar 19 12:04:30.886082 master-0 kubenswrapper[17644]: I0319 12:04:30.886014 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-kubelet-dir\") pod \"6602edde-61c4-4316-a2ca-a21c764eb590\" (UID: \"6602edde-61c4-4316-a2ca-a21c764eb590\") "
Mar 19 12:04:30.886549 master-0 kubenswrapper[17644]: I0319 12:04:30.886327 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6602edde-61c4-4316-a2ca-a21c764eb590" (UID: "6602edde-61c4-4316-a2ca-a21c764eb590"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:04:30.886549 master-0 kubenswrapper[17644]: I0319 12:04:30.886327 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-var-lock" (OuterVolumeSpecName: "var-lock") pod "6602edde-61c4-4316-a2ca-a21c764eb590" (UID: "6602edde-61c4-4316-a2ca-a21c764eb590"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:04:30.887344 master-0 kubenswrapper[17644]: I0319 12:04:30.887229 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:04:30.887344 master-0 kubenswrapper[17644]: I0319 12:04:30.887274 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6602edde-61c4-4316-a2ca-a21c764eb590-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:04:30.893569 master-0 kubenswrapper[17644]: I0319 12:04:30.893506 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6602edde-61c4-4316-a2ca-a21c764eb590-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6602edde-61c4-4316-a2ca-a21c764eb590" (UID: "6602edde-61c4-4316-a2ca-a21c764eb590"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:04:30.989538 master-0 kubenswrapper[17644]: I0319 12:04:30.989418 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6602edde-61c4-4316-a2ca-a21c764eb590-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 12:04:31.425943 master-0 kubenswrapper[17644]: I0319 12:04:31.425816 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"6602edde-61c4-4316-a2ca-a21c764eb590","Type":"ContainerDied","Data":"7f8fd2be4ab4ed9016a4ef756d06935635f3093245c8c659f437f84c27edbbec"}
Mar 19 12:04:31.425943 master-0 kubenswrapper[17644]: I0319 12:04:31.425863 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f8fd2be4ab4ed9016a4ef756d06935635f3093245c8c659f437f84c27edbbec"
Mar 19 12:04:31.426615 master-0 kubenswrapper[17644]: I0319 12:04:31.426590 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 12:04:31.430772 master-0 kubenswrapper[17644]: I0319 12:04:31.430744 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d9758b5c6-n2b98"
Mar 19 12:04:38.204223 master-0 kubenswrapper[17644]: I0319 12:04:38.204148 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7db659c55f-mfdrv"
Mar 19 12:04:38.204815 master-0 kubenswrapper[17644]: I0319 12:04:38.204271 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7db659c55f-mfdrv"
Mar 19 12:04:38.217336 master-0 kubenswrapper[17644]: I0319 12:04:38.217225 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7db659c55f-mfdrv"
Mar 19 12:04:38.480458 master-0 kubenswrapper[17644]: I0319 12:04:38.480345 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7db659c55f-mfdrv"
Mar 19 12:04:38.998693 master-0 kubenswrapper[17644]: I0319 12:04:38.998611 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:04:38.998989 master-0 kubenswrapper[17644]: I0319 12:04:38.998856 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:04:38.999183 master-0 kubenswrapper[17644]: I0319 12:04:38.999158 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:04:38.999183 master-0 kubenswrapper[17644]: I0319 12:04:38.999178 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:04:38.999267 master-0 kubenswrapper[17644]: I0319 12:04:38.999236 17644 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 12:04:38.999325 master-0 kubenswrapper[17644]: I0319 12:04:38.999284 17644 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 12:04:39.003504 master-0 kubenswrapper[17644]: I0319 12:04:39.003434 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:04:39.502011 master-0 kubenswrapper[17644]: I0319 12:04:39.501876 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:04:48.999638 master-0 kubenswrapper[17644]: I0319 12:04:48.999569 17644 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 12:04:49.000462 master-0 kubenswrapper[17644]: I0319 12:04:48.999669 17644 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 12:04:52.554598 master-0 kubenswrapper[17644]: I0319 12:04:52.554501 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 12:04:52.599278 master-0 kubenswrapper[17644]: I0319 12:04:52.599236 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 12:04:52.664794 master-0 kubenswrapper[17644]: I0319 12:04:52.664630 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 12:04:54.143792 master-0 kubenswrapper[17644]: I0319 12:04:54.143691 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.143998 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl" containerID="cri-o://06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46" gracePeriod=30
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.144064 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" containerID="cri-o://0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9" gracePeriod=30
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.144044 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev" containerID="cri-o://c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b" gracePeriod=30
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.144044 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics" containerID="cri-o://4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5" gracePeriod=30
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.144129 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz" containerID="cri-o://b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf" gracePeriod=30
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145348 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145618 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145630 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145647 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145653 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145668 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6602edde-61c4-4316-a2ca-a21c764eb590" containerName="installer"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145674 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="6602edde-61c4-4316-a2ca-a21c764eb590" containerName="installer"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145681 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145688 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145696 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145701 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145709 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145714 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145799 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145809 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145822 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerName="multus-admission-controller"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145828 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerName="multus-admission-controller"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145836 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerName="kube-rbac-proxy"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145842 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerName="kube-rbac-proxy"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145854 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145860 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: E0319 12:04:54.145874 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.145880 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146002 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146027 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146037 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146046 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146055 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerName="multus-admission-controller"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146066 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="6602edde-61c4-4316-a2ca-a21c764eb590" containerName="installer"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146077 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146085 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="89cf2ee8-3664-4502-b70c-b7e0a5e92cb7" containerName="kube-rbac-proxy"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146092 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146103 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 12:04:54.146516 master-0 kubenswrapper[17644]: I0319 12:04:54.146112 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 12:04:54.238900 master-0 kubenswrapper[17644]: I0319 12:04:54.238851 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.238900 master-0 kubenswrapper[17644]: I0319 12:04:54.238903 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.239161 master-0 kubenswrapper[17644]: I0319 12:04:54.238922 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.239161 master-0 kubenswrapper[17644]: I0319 12:04:54.238938 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.239161 master-0 kubenswrapper[17644]: I0319 12:04:54.238981 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.239161 master-0 kubenswrapper[17644]: I0319 12:04:54.239095 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341168 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341272 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341356 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341378 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341396 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341413 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341460 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341482 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341518 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341531 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341553 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:54.341551 master-0 kubenswrapper[17644]: I0319 12:04:54.341567 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:57.668240 master-0 kubenswrapper[17644]: I0319 12:04:57.668101 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 19 12:04:57.670095 master-0 kubenswrapper[17644]: I0319 12:04:57.670015 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 19 12:04:57.675870 master-0 kubenswrapper[17644]: I0319 12:04:57.675827 17644 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b" exitCode=2
Mar 19 12:04:57.675870 master-0 kubenswrapper[17644]: I0319 12:04:57.675860 17644 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf" exitCode=0
Mar 19 12:04:57.675870 master-0 kubenswrapper[17644]: I0319 12:04:57.675872 17644 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5" exitCode=2
Mar 19 12:04:57.678649 master-0 kubenswrapper[17644]: I0319 12:04:57.678570 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-vqnnc" event={"ID":"e17d22fe-fe0f-448e-9666-882d888d3ad4","Type":"ContainerStarted","Data":"764464d636a3c9a85000a9bda52588bbb71166fb8034847d547d99214dbd7561"}
Mar 19 12:04:57.679636 master-0 kubenswrapper[17644]: I0319 12:04:57.679587 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-66b8ffb895-vqnnc"
Mar 19 12:04:57.681661 master-0 kubenswrapper[17644]: I0319 12:04:57.681523 17644 patch_prober.go:28] interesting pod/downloads-66b8ffb895-vqnnc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused" start-of-body=
Mar 19 12:04:57.681769 master-0 kubenswrapper[17644]: I0319 12:04:57.681680 17644 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-vqnnc" podUID="e17d22fe-fe0f-448e-9666-882d888d3ad4" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused"
Mar 19 12:04:58.685209 master-0 kubenswrapper[17644]: I0319 12:04:58.685143 17644 patch_prober.go:28] interesting pod/downloads-66b8ffb895-vqnnc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused" start-of-body=
Mar 19 12:04:58.685674 master-0 kubenswrapper[17644]: I0319 12:04:58.685241 17644 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-vqnnc" podUID="e17d22fe-fe0f-448e-9666-882d888d3ad4" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused"
Mar 19 12:04:58.999132 master-0 kubenswrapper[17644]: I0319 12:04:58.999003 17644 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 12:04:58.999132 master-0 kubenswrapper[17644]: I0319 12:04:58.999074 17644 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 12:04:58.999132 master-0 kubenswrapper[17644]: I0319 12:04:58.999127 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:04:59.001603 master-0 kubenswrapper[17644]: I0319 12:04:58.999790 17644 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"ea2a9b69ce6100be8b7a5ab8a5b3754a3d903dde1f34f7b2ab132e5a43190e43"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 19 12:04:59.001603 master-0 kubenswrapper[17644]: I0319 12:04:58.999919 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager" containerID="cri-o://ea2a9b69ce6100be8b7a5ab8a5b3754a3d903dde1f34f7b2ab132e5a43190e43" gracePeriod=30
Mar 19 12:05:01.952375 master-0 kubenswrapper[17644]: I0319 12:05:01.952304 17644 patch_prober.go:28] interesting pod/downloads-66b8ffb895-vqnnc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused" start-of-body=
Mar 19 12:05:01.952375 master-0 kubenswrapper[17644]: I0319 12:05:01.952363 17644 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-vqnnc" podUID="e17d22fe-fe0f-448e-9666-882d888d3ad4" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused"
Mar 19 12:05:01.953063 master-0 kubenswrapper[17644]: I0319 12:05:01.952426 17644 patch_prober.go:28] interesting pod/downloads-66b8ffb895-vqnnc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused" start-of-body=
Mar 19 12:05:01.953063 master-0 kubenswrapper[17644]: I0319 12:05:01.952445 17644 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-66b8ffb895-vqnnc" podUID="e17d22fe-fe0f-448e-9666-882d888d3ad4" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused"
Mar 19 12:05:08.746354 master-0 kubenswrapper[17644]: I0319 12:05:08.746288 17644 generic.go:334] "Generic (PLEG): container finished" podID="4ec000c4-5cc8-45b3-95ba-2856655f02f5" containerID="2954bb26fffbeed69600aa01dfdf7f588c1039c56fa3503890cba6c8da239e98" exitCode=0
Mar 19 12:05:08.746354 master-0 kubenswrapper[17644]: I0319 12:05:08.746342 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"4ec000c4-5cc8-45b3-95ba-2856655f02f5","Type":"ContainerDied","Data":"2954bb26fffbeed69600aa01dfdf7f588c1039c56fa3503890cba6c8da239e98"}
Mar 19 12:05:10.115246 master-0 kubenswrapper[17644]: I0319 12:05:10.115180 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 19 12:05:10.288509 master-0 kubenswrapper[17644]: I0319 12:05:10.288212 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kube-api-access\") pod \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") "
Mar 19 12:05:10.288509 master-0 kubenswrapper[17644]: I0319 12:05:10.288269 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kubelet-dir\") pod \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") "
Mar 19 12:05:10.288509 master-0 kubenswrapper[17644]: I0319 12:05:10.288342 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-var-lock\") pod \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\" (UID: \"4ec000c4-5cc8-45b3-95ba-2856655f02f5\") "
Mar 19 12:05:10.288509 master-0 kubenswrapper[17644]: I0319 12:05:10.288455 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ec000c4-5cc8-45b3-95ba-2856655f02f5" (UID: "4ec000c4-5cc8-45b3-95ba-2856655f02f5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:05:10.288998 master-0 kubenswrapper[17644]: I0319 12:05:10.288545 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-var-lock" (OuterVolumeSpecName: "var-lock") pod "4ec000c4-5cc8-45b3-95ba-2856655f02f5" (UID: "4ec000c4-5cc8-45b3-95ba-2856655f02f5"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:05:10.288998 master-0 kubenswrapper[17644]: I0319 12:05:10.288808 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:05:10.288998 master-0 kubenswrapper[17644]: I0319 12:05:10.288821 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ec000c4-5cc8-45b3-95ba-2856655f02f5-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:05:10.291795 master-0 kubenswrapper[17644]: I0319 12:05:10.291214 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ec000c4-5cc8-45b3-95ba-2856655f02f5" (UID: "4ec000c4-5cc8-45b3-95ba-2856655f02f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:05:10.390547 master-0 kubenswrapper[17644]: I0319 12:05:10.390428 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec000c4-5cc8-45b3-95ba-2856655f02f5-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 12:05:10.761666 master-0 kubenswrapper[17644]: I0319 12:05:10.761589 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"4ec000c4-5cc8-45b3-95ba-2856655f02f5","Type":"ContainerDied","Data":"5c721c029071b39ac77069956f4b8a9993362193cc4ff7085dec746b764d225e"}
Mar 19 12:05:10.761666 master-0 kubenswrapper[17644]: I0319 12:05:10.761641 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c721c029071b39ac77069956f4b8a9993362193cc4ff7085dec746b764d225e"
Mar 19 12:05:10.762046 master-0 kubenswrapper[17644]: I0319 12:05:10.761702 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 12:05:10.879131 master-0 kubenswrapper[17644]: E0319 12:05:10.879054 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 19 12:05:11.958121 master-0 kubenswrapper[17644]: I0319 12:05:11.957965 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-66b8ffb895-vqnnc" Mar 19 12:05:14.464569 master-0 kubenswrapper[17644]: E0319 12:05:14.464296 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:05:04Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:05:04Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:05:04Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:05:04Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3ea089ab116e164d89b46dc077f87d9af22f525bc2d69403214f77ee3fd30161\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d9cbffb5a2fd538c8f19b7174d2906286acdb37a574b9dce3f9da302074591ff\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746416849},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607b
f3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:aefc421cf2f5dba925f7c149d56ce14e910fbd969a4e22b5917fc912ca33a5b2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:da1ee8c9ae2cb275833f329b3d793a9109915be16d938f208ec917b50d9dd66a\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223644894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\
\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f5
1bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:05:14.800676 master-0 kubenswrapper[17644]: I0319 12:05:14.799939 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-6-master-0_dff9f91a-2293-4b2d-95dd-be0f9152984e/installer/0.log" Mar 19 12:05:14.800676 master-0 kubenswrapper[17644]: I0319 12:05:14.800012 17644 generic.go:334] "Generic (PLEG): container finished" podID="dff9f91a-2293-4b2d-95dd-be0f9152984e" containerID="b04dc99fc70fd502313237dd24a3f425eae2db56e29cdb03b7711c7be66b620c" exitCode=1 Mar 19 12:05:14.800676 master-0 kubenswrapper[17644]: I0319 12:05:14.800050 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"dff9f91a-2293-4b2d-95dd-be0f9152984e","Type":"ContainerDied","Data":"b04dc99fc70fd502313237dd24a3f425eae2db56e29cdb03b7711c7be66b620c"} Mar 19 12:05:16.081838 master-0 kubenswrapper[17644]: I0319 12:05:16.081696 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_installer-6-master-0_dff9f91a-2293-4b2d-95dd-be0f9152984e/installer/0.log" Mar 19 12:05:16.082527 master-0 kubenswrapper[17644]: I0319 12:05:16.082508 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:05:16.109681 master-0 kubenswrapper[17644]: I0319 12:05:16.109629 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dff9f91a-2293-4b2d-95dd-be0f9152984e-kube-api-access\") pod \"dff9f91a-2293-4b2d-95dd-be0f9152984e\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " Mar 19 12:05:16.110091 master-0 kubenswrapper[17644]: I0319 12:05:16.110052 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-kubelet-dir\") pod \"dff9f91a-2293-4b2d-95dd-be0f9152984e\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " Mar 19 12:05:16.110276 master-0 kubenswrapper[17644]: I0319 12:05:16.110150 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dff9f91a-2293-4b2d-95dd-be0f9152984e" (UID: "dff9f91a-2293-4b2d-95dd-be0f9152984e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:05:16.110677 master-0 kubenswrapper[17644]: I0319 12:05:16.110647 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-var-lock\") pod \"dff9f91a-2293-4b2d-95dd-be0f9152984e\" (UID: \"dff9f91a-2293-4b2d-95dd-be0f9152984e\") " Mar 19 12:05:16.110841 master-0 kubenswrapper[17644]: I0319 12:05:16.110774 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-var-lock" (OuterVolumeSpecName: "var-lock") pod "dff9f91a-2293-4b2d-95dd-be0f9152984e" (UID: "dff9f91a-2293-4b2d-95dd-be0f9152984e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:05:16.111366 master-0 kubenswrapper[17644]: I0319 12:05:16.111333 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:16.111415 master-0 kubenswrapper[17644]: I0319 12:05:16.111364 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dff9f91a-2293-4b2d-95dd-be0f9152984e-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:16.112905 master-0 kubenswrapper[17644]: I0319 12:05:16.112841 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff9f91a-2293-4b2d-95dd-be0f9152984e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dff9f91a-2293-4b2d-95dd-be0f9152984e" (UID: "dff9f91a-2293-4b2d-95dd-be0f9152984e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:05:16.213379 master-0 kubenswrapper[17644]: I0319 12:05:16.213310 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dff9f91a-2293-4b2d-95dd-be0f9152984e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:16.824484 master-0 kubenswrapper[17644]: I0319 12:05:16.823671 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-6-master-0_dff9f91a-2293-4b2d-95dd-be0f9152984e/installer/0.log" Mar 19 12:05:16.824484 master-0 kubenswrapper[17644]: I0319 12:05:16.823758 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"dff9f91a-2293-4b2d-95dd-be0f9152984e","Type":"ContainerDied","Data":"9a9cc855dd35c321bb6028dac43d394ebc987b0227f410ae6aae3528faae2384"} Mar 19 12:05:16.824484 master-0 kubenswrapper[17644]: I0319 12:05:16.823795 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a9cc855dd35c321bb6028dac43d394ebc987b0227f410ae6aae3528faae2384" Mar 19 12:05:16.824484 master-0 kubenswrapper[17644]: I0319 12:05:16.823832 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:05:20.880254 master-0 kubenswrapper[17644]: E0319 12:05:20.880176 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:05:24.301139 master-0 kubenswrapper[17644]: I0319 12:05:24.301083 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 12:05:24.301846 master-0 kubenswrapper[17644]: I0319 12:05:24.301815 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 12:05:24.302304 master-0 kubenswrapper[17644]: I0319 12:05:24.302273 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log" Mar 19 12:05:24.302624 master-0 kubenswrapper[17644]: I0319 12:05:24.302597 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 12:05:24.303485 master-0 kubenswrapper[17644]: I0319 12:05:24.303454 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 12:05:24.377494 master-0 kubenswrapper[17644]: I0319 12:05:24.377438 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:05:24.377494 master-0 kubenswrapper[17644]: I0319 12:05:24.377488 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377549 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377589 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "usr-local-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377602 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377617 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir" (OuterVolumeSpecName: "data-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377591 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir" (OuterVolumeSpecName: "log-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "log-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377650 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377675 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377703 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377722 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:05:24.377769 master-0 kubenswrapper[17644]: I0319 12:05:24.377762 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). 
InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:05:24.378249 master-0 kubenswrapper[17644]: I0319 12:05:24.378220 17644 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:24.378249 master-0 kubenswrapper[17644]: I0319 12:05:24.378243 17644 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:24.378249 master-0 kubenswrapper[17644]: I0319 12:05:24.378251 17644 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:24.378348 master-0 kubenswrapper[17644]: I0319 12:05:24.378260 17644 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:24.378348 master-0 kubenswrapper[17644]: I0319 12:05:24.378268 17644 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:24.378348 master-0 kubenswrapper[17644]: I0319 12:05:24.378276 17644 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:05:24.465994 master-0 kubenswrapper[17644]: E0319 12:05:24.465928 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time 
allotted, but may still be processing the request (get nodes master-0)" Mar 19 12:05:24.491502 master-0 kubenswrapper[17644]: I0319 12:05:24.491436 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b4ed170d527099878cb5fdd508a2fb" path="/var/lib/kubelet/pods/24b4ed170d527099878cb5fdd508a2fb/volumes" Mar 19 12:05:24.890193 master-0 kubenswrapper[17644]: I0319 12:05:24.890051 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 12:05:24.891238 master-0 kubenswrapper[17644]: I0319 12:05:24.891215 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 12:05:24.892236 master-0 kubenswrapper[17644]: I0319 12:05:24.892190 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log" Mar 19 12:05:24.892577 master-0 kubenswrapper[17644]: I0319 12:05:24.892549 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 12:05:24.893589 master-0 kubenswrapper[17644]: I0319 12:05:24.893560 17644 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9" exitCode=137 Mar 19 12:05:24.893589 master-0 kubenswrapper[17644]: I0319 12:05:24.893586 17644 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46" exitCode=137 Mar 19 12:05:24.893691 master-0 kubenswrapper[17644]: I0319 12:05:24.893633 17644 scope.go:117] "RemoveContainer" containerID="c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b" Mar 19 12:05:24.893752 master-0 kubenswrapper[17644]: I0319 
12:05:24.893702 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0"
Mar 19 12:05:24.908296 master-0 kubenswrapper[17644]: I0319 12:05:24.908245 17644 scope.go:117] "RemoveContainer" containerID="b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf"
Mar 19 12:05:24.922179 master-0 kubenswrapper[17644]: I0319 12:05:24.922101 17644 scope.go:117] "RemoveContainer" containerID="4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5"
Mar 19 12:05:24.935774 master-0 kubenswrapper[17644]: I0319 12:05:24.935711 17644 scope.go:117] "RemoveContainer" containerID="0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9"
Mar 19 12:05:24.954419 master-0 kubenswrapper[17644]: I0319 12:05:24.954367 17644 scope.go:117] "RemoveContainer" containerID="06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46"
Mar 19 12:05:24.968035 master-0 kubenswrapper[17644]: I0319 12:05:24.967986 17644 scope.go:117] "RemoveContainer" containerID="b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3"
Mar 19 12:05:24.980846 master-0 kubenswrapper[17644]: I0319 12:05:24.980808 17644 scope.go:117] "RemoveContainer" containerID="a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0"
Mar 19 12:05:24.994172 master-0 kubenswrapper[17644]: I0319 12:05:24.994124 17644 scope.go:117] "RemoveContainer" containerID="8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64"
Mar 19 12:05:25.014346 master-0 kubenswrapper[17644]: I0319 12:05:25.014291 17644 scope.go:117] "RemoveContainer" containerID="c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b"
Mar 19 12:05:25.014861 master-0 kubenswrapper[17644]: E0319 12:05:25.014824 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b\": container with ID starting with c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b not found: ID does not exist" containerID="c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b"
Mar 19 12:05:25.014932 master-0 kubenswrapper[17644]: I0319 12:05:25.014862 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b"} err="failed to get container status \"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b\": rpc error: code = NotFound desc = could not find container \"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b\": container with ID starting with c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b not found: ID does not exist"
Mar 19 12:05:25.014932 master-0 kubenswrapper[17644]: I0319 12:05:25.014886 17644 scope.go:117] "RemoveContainer" containerID="b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf"
Mar 19 12:05:25.015437 master-0 kubenswrapper[17644]: E0319 12:05:25.015394 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf\": container with ID starting with b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf not found: ID does not exist" containerID="b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf"
Mar 19 12:05:25.015489 master-0 kubenswrapper[17644]: I0319 12:05:25.015456 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf"} err="failed to get container status \"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf\": rpc error: code = NotFound desc = could not find container \"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf\": container with ID starting with b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf not found: ID does not exist"
Mar 19 12:05:25.015528 master-0 kubenswrapper[17644]: I0319 12:05:25.015498 17644 scope.go:117] "RemoveContainer" containerID="4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5"
Mar 19 12:05:25.015983 master-0 kubenswrapper[17644]: E0319 12:05:25.015956 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5\": container with ID starting with 4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5 not found: ID does not exist" containerID="4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5"
Mar 19 12:05:25.016046 master-0 kubenswrapper[17644]: I0319 12:05:25.015988 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5"} err="failed to get container status \"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5\": rpc error: code = NotFound desc = could not find container \"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5\": container with ID starting with 4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5 not found: ID does not exist"
Mar 19 12:05:25.016046 master-0 kubenswrapper[17644]: I0319 12:05:25.016015 17644 scope.go:117] "RemoveContainer" containerID="0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9"
Mar 19 12:05:25.016383 master-0 kubenswrapper[17644]: E0319 12:05:25.016351 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9\": container with ID starting with 0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9 not found: ID does not exist" containerID="0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9"
Mar 19 12:05:25.016428 master-0 kubenswrapper[17644]: I0319 12:05:25.016378 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9"} err="failed to get container status \"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9\": rpc error: code = NotFound desc = could not find container \"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9\": container with ID starting with 0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9 not found: ID does not exist"
Mar 19 12:05:25.016428 master-0 kubenswrapper[17644]: I0319 12:05:25.016396 17644 scope.go:117] "RemoveContainer" containerID="06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46"
Mar 19 12:05:25.016661 master-0 kubenswrapper[17644]: E0319 12:05:25.016638 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46\": container with ID starting with 06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46 not found: ID does not exist" containerID="06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46"
Mar 19 12:05:25.016703 master-0 kubenswrapper[17644]: I0319 12:05:25.016664 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46"} err="failed to get container status \"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46\": rpc error: code = NotFound desc = could not find container \"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46\": container with ID starting with 06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46 not found: ID does not exist"
Mar 19 12:05:25.016703 master-0 kubenswrapper[17644]: I0319 12:05:25.016679 17644 scope.go:117] "RemoveContainer" containerID="b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3"
Mar 19 12:05:25.017073 master-0 kubenswrapper[17644]: E0319 12:05:25.017038 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3\": container with ID starting with b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3 not found: ID does not exist" containerID="b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3"
Mar 19 12:05:25.017126 master-0 kubenswrapper[17644]: I0319 12:05:25.017084 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3"} err="failed to get container status \"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3\": rpc error: code = NotFound desc = could not find container \"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3\": container with ID starting with b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3 not found: ID does not exist"
Mar 19 12:05:25.017126 master-0 kubenswrapper[17644]: I0319 12:05:25.017113 17644 scope.go:117] "RemoveContainer" containerID="a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0"
Mar 19 12:05:25.017448 master-0 kubenswrapper[17644]: E0319 12:05:25.017416 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0\": container with ID starting with a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0 not found: ID does not exist" containerID="a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0"
Mar 19 12:05:25.017490 master-0 kubenswrapper[17644]: I0319 12:05:25.017444 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0"} err="failed to get container status \"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0\": rpc error: code = NotFound desc = could not find container \"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0\": container with ID starting with a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0 not found: ID does not exist"
Mar 19 12:05:25.017490 master-0 kubenswrapper[17644]: I0319 12:05:25.017461 17644 scope.go:117] "RemoveContainer" containerID="8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64"
Mar 19 12:05:25.017871 master-0 kubenswrapper[17644]: E0319 12:05:25.017825 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64\": container with ID starting with 8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64 not found: ID does not exist" containerID="8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64"
Mar 19 12:05:25.017871 master-0 kubenswrapper[17644]: I0319 12:05:25.017846 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64"} err="failed to get container status \"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64\": rpc error: code = NotFound desc = could not find container \"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64\": container with ID starting with 8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64 not found: ID does not exist"
Mar 19 12:05:25.017871 master-0 kubenswrapper[17644]: I0319 12:05:25.017863 17644 scope.go:117] "RemoveContainer" containerID="c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b"
Mar 19 12:05:25.018125 master-0 kubenswrapper[17644]: I0319 12:05:25.018093 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b"} err="failed to get container status \"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b\": rpc error: code = NotFound desc = could not find container \"c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b\": container with ID starting with c2bb91041db17b87be85528b31c2480989756f9c7e485dd2cc9a4b6bbe2f021b not found: ID does not exist"
Mar 19 12:05:25.018125 master-0 kubenswrapper[17644]: I0319 12:05:25.018116 17644 scope.go:117] "RemoveContainer" containerID="b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf"
Mar 19 12:05:25.018427 master-0 kubenswrapper[17644]: I0319 12:05:25.018398 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf"} err="failed to get container status \"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf\": rpc error: code = NotFound desc = could not find container \"b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf\": container with ID starting with b342656179e33f18902581a908ce540ce6ef0dc91604b6d131a3f77e2a7348cf not found: ID does not exist"
Mar 19 12:05:25.018427 master-0 kubenswrapper[17644]: I0319 12:05:25.018419 17644 scope.go:117] "RemoveContainer" containerID="4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5"
Mar 19 12:05:25.018664 master-0 kubenswrapper[17644]: I0319 12:05:25.018630 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5"} err="failed to get container status \"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5\": rpc error: code = NotFound desc = could not find container \"4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5\": container with ID starting with 4aeac864a0b8c139910b7cc56c7cd968bf6d24973d0e32d571eccc06d033d0f5 not found: ID does not exist"
Mar 19 12:05:25.018664 master-0 kubenswrapper[17644]: I0319 12:05:25.018651 17644 scope.go:117] "RemoveContainer" containerID="0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9"
Mar 19 12:05:25.019099 master-0 kubenswrapper[17644]: I0319 12:05:25.019068 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9"} err="failed to get container status \"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9\": rpc error: code = NotFound desc = could not find container \"0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9\": container with ID starting with 0250f49b8d891954793ad552b261b0ce750c83c05e6b10b449eb9f6c02bf16f9 not found: ID does not exist"
Mar 19 12:05:25.019099 master-0 kubenswrapper[17644]: I0319 12:05:25.019090 17644 scope.go:117] "RemoveContainer" containerID="06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46"
Mar 19 12:05:25.019352 master-0 kubenswrapper[17644]: I0319 12:05:25.019319 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46"} err="failed to get container status \"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46\": rpc error: code = NotFound desc = could not find container \"06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46\": container with ID starting with 06671c3758623dfa519c5bba4e475806636df7ef1dd7182a02cae6c91baa2e46 not found: ID does not exist"
Mar 19 12:05:25.019352 master-0 kubenswrapper[17644]: I0319 12:05:25.019343 17644 scope.go:117] "RemoveContainer" containerID="b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3"
Mar 19 12:05:25.019659 master-0 kubenswrapper[17644]: I0319 12:05:25.019625 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3"} err="failed to get container status \"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3\": rpc error: code = NotFound desc = could not find container \"b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3\": container with ID starting with b76fef7000b310af498f0cffcb969b0c47b51465c0a751707ee0c2ff2e63eba3 not found: ID does not exist"
Mar 19 12:05:25.019659 master-0 kubenswrapper[17644]: I0319 12:05:25.019651 17644 scope.go:117] "RemoveContainer" containerID="a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0"
Mar 19 12:05:25.019929 master-0 kubenswrapper[17644]: I0319 12:05:25.019890 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0"} err="failed to get container status \"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0\": rpc error: code = NotFound desc = could not find container \"a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0\": container with ID starting with a9417d06413f157e4d35a2d3d830254ff285bb6abccccf700d17496320ba4ec0 not found: ID does not exist"
Mar 19 12:05:25.019929 master-0 kubenswrapper[17644]: I0319 12:05:25.019920 17644 scope.go:117] "RemoveContainer" containerID="8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64"
Mar 19 12:05:25.020277 master-0 kubenswrapper[17644]: I0319 12:05:25.020233 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64"} err="failed to get container status \"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64\": rpc error: code = NotFound desc = could not find container \"8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64\": container with ID starting with 8abd4bc13ae0709fc6342131dbb0dfd5a762a5ca0945cd22f3346298ea10ec64 not found: ID does not exist"
Mar 19 12:05:26.789864 master-0 kubenswrapper[17644]: I0319 12:05:26.789703 17644 scope.go:117] "RemoveContainer" containerID="6ecb192a1cfeb4529102ad33aeed1229502ac0d4a0688a01c8e90bffa6cdc39c"
Mar 19 12:05:28.183964 master-0 kubenswrapper[17644]: E0319 12:05:28.183810 17644 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3c9067bdfb6d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:04:54.144015213 +0000 UTC m=+327.913973248,LastTimestamp:2026-03-19 12:04:54.144015213 +0000 UTC m=+327.913973248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 12:05:29.932408 master-0 kubenswrapper[17644]: I0319 12:05:29.932261 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager/0.log"
Mar 19 12:05:29.932408 master-0 kubenswrapper[17644]: I0319 12:05:29.932331 17644 generic.go:334] "Generic (PLEG): container finished" podID="607a35c2a34325129014a178207e606c" containerID="ea2a9b69ce6100be8b7a5ab8a5b3754a3d903dde1f34f7b2ab132e5a43190e43" exitCode=137
Mar 19 12:05:29.932408 master-0 kubenswrapper[17644]: I0319 12:05:29.932372 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerDied","Data":"ea2a9b69ce6100be8b7a5ab8a5b3754a3d903dde1f34f7b2ab132e5a43190e43"}
Mar 19 12:05:29.933007 master-0 kubenswrapper[17644]: I0319 12:05:29.932441 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerStarted","Data":"b444dcaee3b4ca7a60c29c3343ca436c90a224a2cac7695b9a98404124c21d5b"}
Mar 19 12:05:30.881170 master-0 kubenswrapper[17644]: E0319 12:05:30.881109 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:05:34.467031 master-0 kubenswrapper[17644]: E0319 12:05:34.466881 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:05:34.483845 master-0 kubenswrapper[17644]: I0319 12:05:34.483684 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0"
Mar 19 12:05:34.511708 master-0 kubenswrapper[17644]: I0319 12:05:34.511637 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c"
Mar 19 12:05:34.511708 master-0 kubenswrapper[17644]: I0319 12:05:34.511700 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c"
Mar 19 12:05:38.998558 master-0 kubenswrapper[17644]: I0319 12:05:38.998506 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:05:38.999060 master-0 kubenswrapper[17644]: I0319 12:05:38.998597 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:05:39.003434 master-0 kubenswrapper[17644]: I0319 12:05:39.003374 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:05:40.017010 master-0 kubenswrapper[17644]: I0319 12:05:40.016892 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:05:40.882284 master-0 kubenswrapper[17644]: E0319 12:05:40.882146 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:05:42.022342 master-0 kubenswrapper[17644]: I0319 12:05:42.022289 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-j528w_8438d015-106b-4aed-ae12-dda781ce51fc/approver/1.log"
Mar 19 12:05:42.022949 master-0 kubenswrapper[17644]: I0319 12:05:42.022878 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-j528w_8438d015-106b-4aed-ae12-dda781ce51fc/approver/0.log"
Mar 19 12:05:42.023503 master-0 kubenswrapper[17644]: I0319 12:05:42.023461 17644 generic.go:334] "Generic (PLEG): container finished" podID="8438d015-106b-4aed-ae12-dda781ce51fc" containerID="97eb1a465790bd720388085fc15badddd0717999fea7e03106e51d2d591513fd" exitCode=1
Mar 19 12:05:42.023572 master-0 kubenswrapper[17644]: I0319 12:05:42.023519 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-j528w" event={"ID":"8438d015-106b-4aed-ae12-dda781ce51fc","Type":"ContainerDied","Data":"97eb1a465790bd720388085fc15badddd0717999fea7e03106e51d2d591513fd"}
Mar 19 12:05:42.023572 master-0 kubenswrapper[17644]: I0319 12:05:42.023561 17644 scope.go:117] "RemoveContainer" containerID="27aeacdf42166ebdfe7943145673659894eb1a05c94251adf45a06c9d05c04a8"
Mar 19 12:05:42.024629 master-0 kubenswrapper[17644]: I0319 12:05:42.024583 17644 scope.go:117] "RemoveContainer" containerID="97eb1a465790bd720388085fc15badddd0717999fea7e03106e51d2d591513fd"
Mar 19 12:05:43.032691 master-0 kubenswrapper[17644]: I0319 12:05:43.032652 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-j528w_8438d015-106b-4aed-ae12-dda781ce51fc/approver/1.log"
Mar 19 12:05:43.033498 master-0 kubenswrapper[17644]: I0319 12:05:43.033458 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-j528w" event={"ID":"8438d015-106b-4aed-ae12-dda781ce51fc","Type":"ContainerStarted","Data":"c6c5ee07c7513951fc8942186c6381fa0f05c801cb4783d3a8639a668f1827f7"}
Mar 19 12:05:44.467993 master-0 kubenswrapper[17644]: E0319 12:05:44.467878 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:05:50.882834 master-0 kubenswrapper[17644]: E0319 12:05:50.882721 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:05:50.882834 master-0 kubenswrapper[17644]: I0319 12:05:50.882813 17644 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 19 12:05:54.468986 master-0 kubenswrapper[17644]: E0319 12:05:54.468919 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:05:54.468986 master-0 kubenswrapper[17644]: E0319 12:05:54.468963 17644 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 12:05:57.681309 master-0 kubenswrapper[17644]: I0319 12:05:57.681214 17644 status_manager.go:851] "Failed to get status for pod" podUID="e17d22fe-fe0f-448e-9666-882d888d3ad4" pod="openshift-console/downloads-66b8ffb895-vqnnc" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods downloads-66b8ffb895-vqnnc)"
Mar 19 12:06:00.884041 master-0 kubenswrapper[17644]: E0319 12:06:00.883930 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms"
Mar 19 12:06:02.185745 master-0 kubenswrapper[17644]: E0319 12:06:02.185542 17644 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3c9067be8fb6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:04:54.144053174 +0000 UTC m=+327.914011209,LastTimestamp:2026-03-19 12:04:54.144053174 +0000 UTC m=+327.914011209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 12:06:08.514312 master-0 kubenswrapper[17644]: E0319 12:06:08.514243 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 12:06:08.514911 master-0 kubenswrapper[17644]: I0319 12:06:08.514745 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0"
Mar 19 12:06:09.225789 master-0 kubenswrapper[17644]: I0319 12:06:09.225706 17644 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="2a3ca12d1fb51188a318e1e6195d8121477d2c35e75e4a0b624b4c0d5bae5f52" exitCode=0
Mar 19 12:06:09.225789 master-0 kubenswrapper[17644]: I0319 12:06:09.225768 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"2a3ca12d1fb51188a318e1e6195d8121477d2c35e75e4a0b624b4c0d5bae5f52"}
Mar 19 12:06:09.226096 master-0 kubenswrapper[17644]: I0319 12:06:09.225823 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"7df8d35424120515e0cd3fed010fa127893682de4ef17d9b3582fd9f2c93ae26"}
Mar 19 12:06:09.226282 master-0 kubenswrapper[17644]: I0319 12:06:09.226251 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c"
Mar 19 12:06:09.226282 master-0 kubenswrapper[17644]: I0319 12:06:09.226275 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c"
Mar 19 12:06:11.085343 master-0 kubenswrapper[17644]: E0319 12:06:11.085249 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms"
Mar 19 12:06:13.259723 master-0 kubenswrapper[17644]: I0319 12:06:13.259586 17644 generic.go:334] "Generic (PLEG): container finished" podID="5f8c022c-7871-4765-971f-dcafa39357c9" containerID="72a73422baa1bf839575e34cbe90d73e29ac03ab1786e2499f59601d503649f6" exitCode=0
Mar 19 12:06:13.259723 master-0 kubenswrapper[17644]: I0319 12:06:13.259647 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" event={"ID":"5f8c022c-7871-4765-971f-dcafa39357c9","Type":"ContainerDied","Data":"72a73422baa1bf839575e34cbe90d73e29ac03ab1786e2499f59601d503649f6"}
Mar 19 12:06:13.427518 master-0 kubenswrapper[17644]: I0319 12:06:13.427446 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"
Mar 19 12:06:13.519950 master-0 kubenswrapper[17644]: I0319 12:06:13.519852 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") pod \"5f8c022c-7871-4765-971f-dcafa39357c9\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") "
Mar 19 12:06:13.519950 master-0 kubenswrapper[17644]: I0319 12:06:13.519943 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") pod \"5f8c022c-7871-4765-971f-dcafa39357c9\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") "
Mar 19 12:06:13.519950 master-0 kubenswrapper[17644]: I0319 12:06:13.519996 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") pod \"5f8c022c-7871-4765-971f-dcafa39357c9\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") "
Mar 19 12:06:13.520625 master-0 kubenswrapper[17644]: I0319 12:06:13.520017 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") pod \"5f8c022c-7871-4765-971f-dcafa39357c9\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") "
Mar 19 12:06:13.520625 master-0 kubenswrapper[17644]: I0319 12:06:13.520133 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g997b\" (UniqueName: \"kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b\") pod \"5f8c022c-7871-4765-971f-dcafa39357c9\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") "
Mar 19 12:06:13.520625 master-0 kubenswrapper[17644]: I0319 12:06:13.520193 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") pod \"5f8c022c-7871-4765-971f-dcafa39357c9\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") "
Mar 19 12:06:13.520625 master-0 kubenswrapper[17644]: I0319 12:06:13.520234 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log\") pod \"5f8c022c-7871-4765-971f-dcafa39357c9\" (UID: \"5f8c022c-7871-4765-971f-dcafa39357c9\") "
Mar 19 12:06:13.520844 master-0 kubenswrapper[17644]: I0319 12:06:13.520816 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log" (OuterVolumeSpecName: "audit-log") pod "5f8c022c-7871-4765-971f-dcafa39357c9" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:06:13.520935 master-0 kubenswrapper[17644]: I0319 12:06:13.520862 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "5f8c022c-7871-4765-971f-dcafa39357c9" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:06:13.521568 master-0 kubenswrapper[17644]: I0319 12:06:13.521480 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "5f8c022c-7871-4765-971f-dcafa39357c9" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:06:13.524252 master-0 kubenswrapper[17644]: I0319 12:06:13.524201 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "5f8c022c-7871-4765-971f-dcafa39357c9" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:06:13.524568 master-0 kubenswrapper[17644]: I0319 12:06:13.524525 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "5f8c022c-7871-4765-971f-dcafa39357c9" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:06:13.525241 master-0 kubenswrapper[17644]: I0319 12:06:13.525196 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b" (OuterVolumeSpecName: "kube-api-access-g997b") pod "5f8c022c-7871-4765-971f-dcafa39357c9" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9"). InnerVolumeSpecName "kube-api-access-g997b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:06:13.525373 master-0 kubenswrapper[17644]: I0319 12:06:13.525277 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "5f8c022c-7871-4765-971f-dcafa39357c9" (UID: "5f8c022c-7871-4765-971f-dcafa39357c9"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:06:13.622811 master-0 kubenswrapper[17644]: I0319 12:06:13.622603 17644 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:06:13.622811 master-0 kubenswrapper[17644]: I0319 12:06:13.622653 17644 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f8c022c-7871-4765-971f-dcafa39357c9-audit-log\") on node \"master-0\" DevicePath \"\""
Mar 19 12:06:13.622811 master-0 kubenswrapper[17644]: I0319 12:06:13.622667 17644 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f8c022c-7871-4765-971f-dcafa39357c9-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\""
Mar 19 12:06:13.622811 master-0 kubenswrapper[17644]: I0319 12:06:13.622677 17644 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\""
Mar 19 12:06:13.622811 master-0 kubenswrapper[17644]: I0319 12:06:13.622687 17644 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:06:13.622811 master-0 kubenswrapper[17644]: I0319 12:06:13.622699 17644 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c022c-7871-4765-971f-dcafa39357c9-client-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:06:13.622811 master-0 kubenswrapper[17644]: I0319 12:06:13.622708 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g997b\" (UniqueName: \"kubernetes.io/projected/5f8c022c-7871-4765-971f-dcafa39357c9-kube-api-access-g997b\") on node \"master-0\" DevicePath \"\""
Mar 19 12:06:14.273248 master-0 kubenswrapper[17644]: I0319 12:06:14.273142 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" event={"ID":"5f8c022c-7871-4765-971f-dcafa39357c9","Type":"ContainerDied","Data":"d26de8d7725dab288840f8eb4631a12a8821676d8fd47b0810577c9ee4f7e3b9"}
Mar 19 12:06:14.274406 master-0 kubenswrapper[17644]: I0319 12:06:14.273251 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5" Mar 19 12:06:14.274406 master-0 kubenswrapper[17644]: I0319 12:06:14.273266 17644 scope.go:117] "RemoveContainer" containerID="72a73422baa1bf839575e34cbe90d73e29ac03ab1786e2499f59601d503649f6" Mar 19 12:06:14.663666 master-0 kubenswrapper[17644]: E0319 12:06:14.663279 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:06:04Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:06:04Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:06:04Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:06:04Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3ea089ab116e164d89b46dc077f87d9af22f525bc2d69403214f77ee3fd30161\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d9cbffb5a2fd538c8f19b7174d2906286acdb37a574b9dce3f9da302074591ff\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746416849},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e6783
9ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:aefc421cf2f5dba925f7c149d56ce14e910fbd969a4e22b5917fc912ca33a5b2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:da1ee8c9ae2cb275833f329b3d793a9109915be16d938f208ec917b50d9dd66a\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223644894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f
73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665b
ae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:21.486943 master-0 kubenswrapper[17644]: E0319 12:06:21.486825 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 19 12:06:24.664704 master-0 kubenswrapper[17644]: E0319 12:06:24.664637 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:32.288394 master-0 kubenswrapper[17644]: E0319 12:06:32.288280 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 19 12:06:34.665484 master-0 kubenswrapper[17644]: E0319 12:06:34.665418 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:36.189003 master-0 kubenswrapper[17644]: E0319 12:06:36.188820 17644 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3c9067bf318d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:04:54.144094605 +0000 UTC m=+327.914052640,LastTimestamp:2026-03-19 12:04:54.144094605 +0000 UTC m=+327.914052640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:06:43.229873 master-0 kubenswrapper[17644]: E0319 12:06:43.229781 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 12:06:43.491151 master-0 kubenswrapper[17644]: I0319 12:06:43.491113 17644 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="86048b246605f3372faf52c4f3da34632fa66c503639c3804bf00146cbe30554" exitCode=0 Mar 19 12:06:43.491276 master-0 kubenswrapper[17644]: I0319 12:06:43.491160 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"86048b246605f3372faf52c4f3da34632fa66c503639c3804bf00146cbe30554"} Mar 19 12:06:43.491503 master-0 kubenswrapper[17644]: 
I0319 12:06:43.491474 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c" Mar 19 12:06:43.491503 master-0 kubenswrapper[17644]: I0319 12:06:43.491495 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c" Mar 19 12:06:43.889034 master-0 kubenswrapper[17644]: E0319 12:06:43.888890 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 19 12:06:44.665977 master-0 kubenswrapper[17644]: E0319 12:06:44.665906 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:46.510428 master-0 kubenswrapper[17644]: I0319 12:06:46.510368 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-mjwfm_1b94d1eb-1b80-4a14-b1c0-d9e192231352/manager/1.log" Mar 19 12:06:46.511324 master-0 kubenswrapper[17644]: I0319 12:06:46.511303 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-mjwfm_1b94d1eb-1b80-4a14-b1c0-d9e192231352/manager/0.log" Mar 19 12:06:46.511401 master-0 kubenswrapper[17644]: I0319 12:06:46.511351 17644 generic.go:334] "Generic (PLEG): container finished" podID="1b94d1eb-1b80-4a14-b1c0-d9e192231352" containerID="54f6f1a412b81f0f7c7a43eff29ebb6260a16932752b0d9e46f5d27af722be26" exitCode=1 Mar 19 12:06:46.511460 master-0 kubenswrapper[17644]: I0319 12:06:46.511397 17644 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" event={"ID":"1b94d1eb-1b80-4a14-b1c0-d9e192231352","Type":"ContainerDied","Data":"54f6f1a412b81f0f7c7a43eff29ebb6260a16932752b0d9e46f5d27af722be26"} Mar 19 12:06:46.511522 master-0 kubenswrapper[17644]: I0319 12:06:46.511472 17644 scope.go:117] "RemoveContainer" containerID="8fcb298ecd66e79f2851c7b4502a7734938f56462fb5de52ed324ec2a3679f14" Mar 19 12:06:46.512320 master-0 kubenswrapper[17644]: I0319 12:06:46.512278 17644 scope.go:117] "RemoveContainer" containerID="54f6f1a412b81f0f7c7a43eff29ebb6260a16932752b0d9e46f5d27af722be26" Mar 19 12:06:46.688181 master-0 kubenswrapper[17644]: I0319 12:06:46.688142 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 12:06:47.520129 master-0 kubenswrapper[17644]: I0319 12:06:47.520058 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-mjwfm_1b94d1eb-1b80-4a14-b1c0-d9e192231352/manager/1.log" Mar 19 12:06:47.520887 master-0 kubenswrapper[17644]: I0319 12:06:47.520395 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" event={"ID":"1b94d1eb-1b80-4a14-b1c0-d9e192231352","Type":"ContainerStarted","Data":"a5769435b8279c73564fcbcd76fae395aa4293874614dd53ac15d3f4d292e903"} Mar 19 12:06:47.520887 master-0 kubenswrapper[17644]: I0319 12:06:47.520555 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 12:06:48.529854 master-0 kubenswrapper[17644]: I0319 12:06:48.529787 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/1.log" Mar 19 12:06:48.530652 master-0 kubenswrapper[17644]: I0319 12:06:48.530442 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/0.log" Mar 19 12:06:48.530652 master-0 kubenswrapper[17644]: I0319 12:06:48.530522 17644 generic.go:334] "Generic (PLEG): container finished" podID="d625c81e-01cc-424a-997d-546a5204a72b" containerID="853d3eee88157502b76c3c9b20b3de3f2808774e2eca0856840a19b4a56c5c18" exitCode=1 Mar 19 12:06:48.530652 master-0 kubenswrapper[17644]: I0319 12:06:48.530602 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerDied","Data":"853d3eee88157502b76c3c9b20b3de3f2808774e2eca0856840a19b4a56c5c18"} Mar 19 12:06:48.530990 master-0 kubenswrapper[17644]: I0319 12:06:48.530704 17644 scope.go:117] "RemoveContainer" containerID="e21a965ed4cb2dd18edb22058723998ac546681c370497fc8735a2d87bc17971" Mar 19 12:06:48.532742 master-0 kubenswrapper[17644]: I0319 12:06:48.532658 17644 scope.go:117] "RemoveContainer" containerID="853d3eee88157502b76c3c9b20b3de3f2808774e2eca0856840a19b4a56c5c18" Mar 19 12:06:49.539922 master-0 kubenswrapper[17644]: I0319 12:06:49.539863 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/1.log" Mar 19 12:06:49.540458 master-0 kubenswrapper[17644]: I0319 12:06:49.539934 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" 
event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerStarted","Data":"4330a60e4005ff3453f41f9e734958fd9bc2e8f3531b9166c2303ff1d9076a60"} Mar 19 12:06:54.666988 master-0 kubenswrapper[17644]: E0319 12:06:54.666931 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:54.666988 master-0 kubenswrapper[17644]: E0319 12:06:54.666973 17644 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 12:06:56.690192 master-0 kubenswrapper[17644]: I0319 12:06:56.690147 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-mjwfm" Mar 19 12:06:57.090342 master-0 kubenswrapper[17644]: E0319 12:06:57.090108 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="6.4s" Mar 19 12:06:57.689214 master-0 kubenswrapper[17644]: I0319 12:06:57.689128 17644 status_manager.go:851] "Failed to get status for pod" podUID="dff9f91a-2293-4b2d-95dd-be0f9152984e" pod="openshift-kube-apiserver/installer-6-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-6-master-0)" Mar 19 12:06:58.655648 master-0 kubenswrapper[17644]: I0319 12:06:58.655548 17644 generic.go:334] "Generic (PLEG): container finished" podID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerID="78a6cddfc0e0acdf16c683d2f70de891e291d18f727a26ee57b67fdf44168c74" exitCode=0 Mar 19 12:06:58.656367 master-0 kubenswrapper[17644]: I0319 12:06:58.655653 17644 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" event={"ID":"b3de8a1b-a5be-414f-86e8-738e16c8bc97","Type":"ContainerDied","Data":"78a6cddfc0e0acdf16c683d2f70de891e291d18f727a26ee57b67fdf44168c74"} Mar 19 12:06:58.656367 master-0 kubenswrapper[17644]: I0319 12:06:58.655771 17644 scope.go:117] "RemoveContainer" containerID="147ea002de1d61b828f3e4f59b89474a76a533a161c3a8b138665844ccc9c433" Mar 19 12:06:58.656593 master-0 kubenswrapper[17644]: I0319 12:06:58.656554 17644 scope.go:117] "RemoveContainer" containerID="78a6cddfc0e0acdf16c683d2f70de891e291d18f727a26ee57b67fdf44168c74" Mar 19 12:06:58.656881 master-0 kubenswrapper[17644]: E0319 12:06:58.656823 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-89ccd998f-bftt4_openshift-marketplace(b3de8a1b-a5be-414f-86e8-738e16c8bc97)\"" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" Mar 19 12:06:59.665837 master-0 kubenswrapper[17644]: I0319 12:06:59.665775 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/config-sync-controllers/0.log" Mar 19 12:06:59.666647 master-0 kubenswrapper[17644]: I0319 12:06:59.666594 17644 generic.go:334] "Generic (PLEG): container finished" podID="c8d8a09f-22d5-4f16-84d6-d5f2c504c949" containerID="6a8ee95c82e6b677420027c38c1c68131911d17ea065c53213f6254b809ed080" exitCode=1 Mar 19 12:06:59.666803 master-0 kubenswrapper[17644]: I0319 12:06:59.666670 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" 
event={"ID":"c8d8a09f-22d5-4f16-84d6-d5f2c504c949","Type":"ContainerDied","Data":"6a8ee95c82e6b677420027c38c1c68131911d17ea065c53213f6254b809ed080"} Mar 19 12:06:59.667473 master-0 kubenswrapper[17644]: I0319 12:06:59.667457 17644 scope.go:117] "RemoveContainer" containerID="6a8ee95c82e6b677420027c38c1c68131911d17ea065c53213f6254b809ed080" Mar 19 12:07:00.677851 master-0 kubenswrapper[17644]: I0319 12:07:00.677796 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/config-sync-controllers/0.log" Mar 19 12:07:00.678568 master-0 kubenswrapper[17644]: I0319 12:07:00.678530 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" event={"ID":"c8d8a09f-22d5-4f16-84d6-d5f2c504c949","Type":"ContainerStarted","Data":"38ae5309a5240e2771d671a20d3e119fcc57a893b379503de62cf347523bd835"} Mar 19 12:07:03.698463 master-0 kubenswrapper[17644]: I0319 12:07:03.698417 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/manager/1.log" Mar 19 12:07:03.699003 master-0 kubenswrapper[17644]: I0319 12:07:03.698943 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/manager/0.log" Mar 19 12:07:03.699628 master-0 kubenswrapper[17644]: I0319 12:07:03.699589 17644 generic.go:334] "Generic (PLEG): container finished" podID="376b18a9-5f33-44fd-a37b-20ab02c5e65d" containerID="2488db84b0849c81166877e395ec16ae06df9df840cc1e0200c1e2aef0f75b5f" exitCode=1 Mar 19 12:07:03.699701 master-0 kubenswrapper[17644]: I0319 12:07:03.699642 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" event={"ID":"376b18a9-5f33-44fd-a37b-20ab02c5e65d","Type":"ContainerDied","Data":"2488db84b0849c81166877e395ec16ae06df9df840cc1e0200c1e2aef0f75b5f"} Mar 19 12:07:03.699701 master-0 kubenswrapper[17644]: I0319 12:07:03.699683 17644 scope.go:117] "RemoveContainer" containerID="4c09f5575088b49e0ef7e52a5eb347dfd8470474e6a6ff5bf019908a8d6b87bc" Mar 19 12:07:03.700864 master-0 kubenswrapper[17644]: I0319 12:07:03.700805 17644 scope.go:117] "RemoveContainer" containerID="2488db84b0849c81166877e395ec16ae06df9df840cc1e0200c1e2aef0f75b5f" Mar 19 12:07:04.710904 master-0 kubenswrapper[17644]: I0319 12:07:04.709193 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/manager/1.log" Mar 19 12:07:04.710904 master-0 kubenswrapper[17644]: I0319 12:07:04.709585 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" event={"ID":"376b18a9-5f33-44fd-a37b-20ab02c5e65d","Type":"ContainerStarted","Data":"68341d5dde661e41cb74b28be5e218c393624c4b77bfb17252ee57f1d6c1b665"} Mar 19 12:07:04.710904 master-0 kubenswrapper[17644]: I0319 12:07:04.709926 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq" Mar 19 12:07:06.664521 master-0 kubenswrapper[17644]: I0319 12:07:06.664429 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 12:07:06.666071 master-0 kubenswrapper[17644]: I0319 12:07:06.664565 17644 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" Mar 19 12:07:06.666071 master-0 kubenswrapper[17644]: I0319 12:07:06.665354 17644 scope.go:117] "RemoveContainer" 
containerID="78a6cddfc0e0acdf16c683d2f70de891e291d18f727a26ee57b67fdf44168c74"
Mar 19 12:07:06.666071 master-0 kubenswrapper[17644]: E0319 12:07:06.665930 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-89ccd998f-bftt4_openshift-marketplace(b3de8a1b-a5be-414f-86e8-738e16c8bc97)\"" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97"
Mar 19 12:07:06.727339 master-0 kubenswrapper[17644]: I0319 12:07:06.727284 17644 scope.go:117] "RemoveContainer" containerID="78a6cddfc0e0acdf16c683d2f70de891e291d18f727a26ee57b67fdf44168c74"
Mar 19 12:07:06.728085 master-0 kubenswrapper[17644]: E0319 12:07:06.728060 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-89ccd998f-bftt4_openshift-marketplace(b3de8a1b-a5be-414f-86e8-738e16c8bc97)\"" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97"
Mar 19 12:07:07.737959 master-0 kubenswrapper[17644]: I0319 12:07:07.737909 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/config-sync-controllers/0.log"
Mar 19 12:07:07.739900 master-0 kubenswrapper[17644]: I0319 12:07:07.739843 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/cluster-cloud-controller-manager/0.log"
Mar 19 12:07:07.740029 master-0 kubenswrapper[17644]: I0319 12:07:07.739931 17644 generic.go:334] "Generic (PLEG): container finished" podID="c8d8a09f-22d5-4f16-84d6-d5f2c504c949" containerID="934c2b00a5a26c98555551f393b5ebfacf94aab19ba7eb619ef808c070c8dab1" exitCode=1
Mar 19 12:07:07.740029 master-0 kubenswrapper[17644]: I0319 12:07:07.739976 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" event={"ID":"c8d8a09f-22d5-4f16-84d6-d5f2c504c949","Type":"ContainerDied","Data":"934c2b00a5a26c98555551f393b5ebfacf94aab19ba7eb619ef808c070c8dab1"}
Mar 19 12:07:07.740716 master-0 kubenswrapper[17644]: I0319 12:07:07.740645 17644 scope.go:117] "RemoveContainer" containerID="934c2b00a5a26c98555551f393b5ebfacf94aab19ba7eb619ef808c070c8dab1"
Mar 19 12:07:08.003963 master-0 kubenswrapper[17644]: I0319 12:07:08.003857 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-xzxpq"
Mar 19 12:07:08.752604 master-0 kubenswrapper[17644]: I0319 12:07:08.752548 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/config-sync-controllers/0.log"
Mar 19 12:07:08.753319 master-0 kubenswrapper[17644]: I0319 12:07:08.753134 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/cluster-cloud-controller-manager/0.log"
Mar 19 12:07:08.753319 master-0 kubenswrapper[17644]: I0319 12:07:08.753207 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-87z86" event={"ID":"c8d8a09f-22d5-4f16-84d6-d5f2c504c949","Type":"ContainerStarted","Data":"43ea2d45f477510e8502ad46f4434402816be56f8d27ba9dc9706c868ea3da9a"}
Mar 19 12:07:10.191945 master-0 kubenswrapper[17644]: E0319 12:07:10.191783 17644 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{downloads-66b8ffb895-vqnnc.189e3c911782753f openshift-console 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:downloads-66b8ffb895-vqnnc,UID:e17d22fe-fe0f-448e-9666-882d888d3ad4,APIVersion:v1,ResourceVersion:14739,FieldPath:spec.containers{download-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\" in 44.647s (44.647s including waiting). Image size: 2895807090 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:04:57.092904255 +0000 UTC m=+330.862862300,LastTimestamp:2026-03-19 12:04:57.092904255 +0000 UTC m=+330.862862300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 12:07:13.491357 master-0 kubenswrapper[17644]: E0319 12:07:13.491226 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 12:07:14.945886 master-0 kubenswrapper[17644]: E0319 12:07:14.945635 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:07:04Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:07:04Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:07:04Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:07:04Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3ea089ab116e164d89b46dc077f87d9af22f525bc2d69403214f77ee3fd30161\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d9cbffb5a2fd538c8f19b7174d2906286acdb37a574b9dce3f9da302074591ff\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746416849},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:aefc421cf2f5dba925f7c149d56ce14e910fbd969a4e22b5917fc912ca33a5b2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:da1ee8c9ae2cb275833f329b3d793a9109915be16d938f208ec917b50d9dd66a\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223644894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:17.494130 master-0 kubenswrapper[17644]: E0319 12:07:17.494061 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 12:07:17.841918 master-0 kubenswrapper[17644]: I0319 12:07:17.841871 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager/0.log"
Mar 19 12:07:17.842068 master-0 kubenswrapper[17644]: I0319 12:07:17.841942 17644 generic.go:334] "Generic (PLEG): container finished" podID="607a35c2a34325129014a178207e606c" containerID="f7c540b0641ec46c201cc061924c6ea67fd66d520e406a823de347faf358f648" exitCode=0
Mar 19 12:07:17.842068 master-0 kubenswrapper[17644]: I0319 12:07:17.841987 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerDied","Data":"f7c540b0641ec46c201cc061924c6ea67fd66d520e406a823de347faf358f648"}
Mar 19 12:07:17.842938 master-0 kubenswrapper[17644]: I0319 12:07:17.842898 17644 scope.go:117] "RemoveContainer" containerID="f7c540b0641ec46c201cc061924c6ea67fd66d520e406a823de347faf358f648"
Mar 19 12:07:17.847011 master-0 kubenswrapper[17644]: I0319 12:07:17.846949 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"11d71ac65553861d6e6e6e2640dcd0dc181545ebb7fde7d453d25fadc2d7dd2d"}
Mar 19 12:07:17.847439 master-0 kubenswrapper[17644]: I0319 12:07:17.847367 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c"
Mar 19 12:07:17.847439 master-0 kubenswrapper[17644]: I0319 12:07:17.847387 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c"
Mar 19 12:07:18.859337 master-0 kubenswrapper[17644]: I0319 12:07:18.859272 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager/0.log"
Mar 19 12:07:18.859931 master-0 kubenswrapper[17644]: I0319 12:07:18.859396 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerStarted","Data":"9d993cd1a5833f4165f6cc8d12eb8bf154b773e372dc7ee598d180665ac9fcd8"}
Mar 19 12:07:18.862532 master-0 kubenswrapper[17644]: I0319 12:07:18.862472 17644 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="11d71ac65553861d6e6e6e2640dcd0dc181545ebb7fde7d453d25fadc2d7dd2d" exitCode=0
Mar 19 12:07:18.862605 master-0 kubenswrapper[17644]: I0319 12:07:18.862579 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"11d71ac65553861d6e6e6e2640dcd0dc181545ebb7fde7d453d25fadc2d7dd2d"}
Mar 19 12:07:18.864827 master-0 kubenswrapper[17644]: I0319 12:07:18.864720 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/2.log"
Mar 19 12:07:18.865604 master-0 kubenswrapper[17644]: I0319 12:07:18.865582 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/1.log"
Mar 19 12:07:18.865765 master-0 kubenswrapper[17644]: I0319 12:07:18.865744 17644 generic.go:334] "Generic (PLEG): container finished" podID="d625c81e-01cc-424a-997d-546a5204a72b" containerID="4330a60e4005ff3453f41f9e734958fd9bc2e8f3531b9166c2303ff1d9076a60" exitCode=1
Mar 19 12:07:18.865863 master-0 kubenswrapper[17644]: I0319 12:07:18.865824 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerDied","Data":"4330a60e4005ff3453f41f9e734958fd9bc2e8f3531b9166c2303ff1d9076a60"}
Mar 19 12:07:18.866167 master-0 kubenswrapper[17644]: I0319 12:07:18.866152 17644 scope.go:117] "RemoveContainer" containerID="853d3eee88157502b76c3c9b20b3de3f2808774e2eca0856840a19b4a56c5c18"
Mar 19 12:07:18.868098 master-0 kubenswrapper[17644]: I0319 12:07:18.867591 17644 scope.go:117] "RemoveContainer" containerID="4330a60e4005ff3453f41f9e734958fd9bc2e8f3531b9166c2303ff1d9076a60"
Mar 19 12:07:18.873452 master-0 kubenswrapper[17644]: E0319 12:07:18.873368 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-764k4_openshift-cluster-storage-operator(d625c81e-01cc-424a-997d-546a5204a72b)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" podUID="d625c81e-01cc-424a-997d-546a5204a72b"
Mar 19 12:07:18.999123 master-0 kubenswrapper[17644]: I0319 12:07:18.998663 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:07:18.999123 master-0 kubenswrapper[17644]: I0319 12:07:18.998833 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:07:19.880379 master-0 kubenswrapper[17644]: I0319 12:07:19.879370 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/2.log"
Mar 19 12:07:21.484452 master-0 kubenswrapper[17644]: I0319 12:07:21.484363 17644 scope.go:117] "RemoveContainer" containerID="78a6cddfc0e0acdf16c683d2f70de891e291d18f727a26ee57b67fdf44168c74"
Mar 19 12:07:21.899634 master-0 kubenswrapper[17644]: I0319 12:07:21.899442 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" event={"ID":"b3de8a1b-a5be-414f-86e8-738e16c8bc97","Type":"ContainerStarted","Data":"8f41a698bbff9ea1def7f5af79fd7e5b9e05dd5ad8cb6f6eacff40ff3e028aab"}
Mar 19 12:07:21.900021 master-0 kubenswrapper[17644]: I0319 12:07:21.899956 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 12:07:21.903108 master-0 kubenswrapper[17644]: I0319 12:07:21.903064 17644 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-bftt4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body=
Mar 19 12:07:21.903203 master-0 kubenswrapper[17644]: I0319 12:07:21.903123 17644 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4" podUID="b3de8a1b-a5be-414f-86e8-738e16c8bc97" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused"
Mar 19 12:07:21.999430 master-0 kubenswrapper[17644]: I0319 12:07:21.999333 17644 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 12:07:21.999835 master-0 kubenswrapper[17644]: I0319 12:07:21.999460 17644 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:22.911804 master-0 kubenswrapper[17644]: I0319 12:07:22.911747 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-bftt4"
Mar 19 12:07:23.918400 master-0 kubenswrapper[17644]: I0319 12:07:23.918327 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-5zvc5_a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc/machine-approver-controller/0.log"
Mar 19 12:07:23.919353 master-0 kubenswrapper[17644]: I0319 12:07:23.918706 17644 generic.go:334] "Generic (PLEG): container finished" podID="a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc" containerID="5a48c0f16a923ed0b10bf8df2ccd8ed50c32745daa2a915ec8165d2602a44666" exitCode=255
Mar 19 12:07:23.919791 master-0 kubenswrapper[17644]: I0319 12:07:23.919749 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" event={"ID":"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc","Type":"ContainerDied","Data":"5a48c0f16a923ed0b10bf8df2ccd8ed50c32745daa2a915ec8165d2602a44666"}
Mar 19 12:07:23.920413 master-0 kubenswrapper[17644]: I0319 12:07:23.920379 17644 scope.go:117] "RemoveContainer" containerID="5a48c0f16a923ed0b10bf8df2ccd8ed50c32745daa2a915ec8165d2602a44666"
Mar 19 12:07:24.928745 master-0 kubenswrapper[17644]: I0319 12:07:24.928672 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-5zvc5_a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc/machine-approver-controller/0.log"
Mar 19 12:07:24.929321 master-0 kubenswrapper[17644]: I0319 12:07:24.929130 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-5zvc5" event={"ID":"a198a18a-4dd8-4c35-b15e-2ed8acfe9bbc","Type":"ContainerStarted","Data":"619a16260c64c7518af5631fc0e7c160570e91f817044b6d9b6081cbe5d2eae6"}
Mar 19 12:07:24.946538 master-0 kubenswrapper[17644]: E0319 12:07:24.946466 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:29.963709 master-0 kubenswrapper[17644]: I0319 12:07:29.963614 17644 generic.go:334] "Generic (PLEG): container finished" podID="97f5b7e8-eee9-42b1-a23e-8a74f1ce4585" containerID="6ed56431e7e3a29594e8c55d24af97e05dc53fc52776fe94fedb9d579e864bcd" exitCode=0
Mar 19 12:07:29.963709 master-0 kubenswrapper[17644]: I0319 12:07:29.963694 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" event={"ID":"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585","Type":"ContainerDied","Data":"6ed56431e7e3a29594e8c55d24af97e05dc53fc52776fe94fedb9d579e864bcd"}
Mar 19 12:07:29.964598 master-0 kubenswrapper[17644]: I0319 12:07:29.964538 17644 scope.go:117] "RemoveContainer" containerID="6ed56431e7e3a29594e8c55d24af97e05dc53fc52776fe94fedb9d579e864bcd"
Mar 19 12:07:30.484865 master-0 kubenswrapper[17644]: I0319 12:07:30.484714 17644 scope.go:117] "RemoveContainer" containerID="4330a60e4005ff3453f41f9e734958fd9bc2e8f3531b9166c2303ff1d9076a60"
Mar 19 12:07:30.492555 master-0 kubenswrapper[17644]: E0319 12:07:30.492487 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 12:07:30.977579 master-0 kubenswrapper[17644]: I0319 12:07:30.977506 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/2.log"
Mar 19 12:07:30.978588 master-0 kubenswrapper[17644]: I0319 12:07:30.977682 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerStarted","Data":"d7edf97138edd898b8970970ba2baab9570971be057da4099c12f0fae904c734"}
Mar 19 12:07:30.980962 master-0 kubenswrapper[17644]: I0319 12:07:30.980873 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" event={"ID":"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585","Type":"ContainerStarted","Data":"89e2fb0436aaf5420411b2928148d341d7b6c5d6dab580ec01bca9d61919925b"}
Mar 19 12:07:30.981565 master-0 kubenswrapper[17644]: I0319 12:07:30.981504 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs"
Mar 19 12:07:30.986412 master-0 kubenswrapper[17644]: I0319 12:07:30.986335 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs"
Mar 19 12:07:32.000164 master-0 kubenswrapper[17644]: I0319 12:07:32.000054 17644 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 12:07:32.001142 master-0 kubenswrapper[17644]: I0319 12:07:32.000184 17644 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:32.007144 master-0 kubenswrapper[17644]: I0319 12:07:32.007081 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/0.log"
Mar 19 12:07:32.007289 master-0 kubenswrapper[17644]: I0319 12:07:32.007160 17644 generic.go:334] "Generic (PLEG): container finished" podID="92e401a4-ed2f-46f7-924b-329d7b313e6a" containerID="46876a7e063d974c121cff378937380f72a9002e08dc430717d4d702ce311e44" exitCode=1
Mar 19 12:07:32.007289 master-0 kubenswrapper[17644]: I0319 12:07:32.007221 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" event={"ID":"92e401a4-ed2f-46f7-924b-329d7b313e6a","Type":"ContainerDied","Data":"46876a7e063d974c121cff378937380f72a9002e08dc430717d4d702ce311e44"}
Mar 19 12:07:32.008358 master-0 kubenswrapper[17644]: I0319 12:07:32.008320 17644 scope.go:117] "RemoveContainer" containerID="46876a7e063d974c121cff378937380f72a9002e08dc430717d4d702ce311e44"
Mar 19 12:07:33.017359 master-0 kubenswrapper[17644]: I0319 12:07:33.017311 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/0.log"
Mar 19 12:07:33.017939 master-0 kubenswrapper[17644]: I0319 12:07:33.017613 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" event={"ID":"92e401a4-ed2f-46f7-924b-329d7b313e6a","Type":"ContainerStarted","Data":"32d00ec9f79f3b36c30f20da42acbe4672f3d595a12272450bf9df8eb150b8ba"}
Mar 19 12:07:34.026404 master-0 kubenswrapper[17644]: I0319 12:07:34.026340 17644 generic.go:334] "Generic (PLEG): container finished" podID="daf4dbb6-5a0a-4c92-a930-479a7330ace1" containerID="e906b08d672ce284b02875a6853f0e751eac277b721f1ba103b6a1fce5dcd578" exitCode=0
Mar 19 12:07:34.026404 master-0 kubenswrapper[17644]: I0319 12:07:34.026395 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" event={"ID":"daf4dbb6-5a0a-4c92-a930-479a7330ace1","Type":"ContainerDied","Data":"e906b08d672ce284b02875a6853f0e751eac277b721f1ba103b6a1fce5dcd578"}
Mar 19 12:07:34.027022 master-0 kubenswrapper[17644]: I0319 12:07:34.026432 17644 scope.go:117] "RemoveContainer" containerID="1961370e7c6f3b39c50205c4d3f694632a63f87701e4b9c1a6a05e005ec065b1"
Mar 19 12:07:34.027203 master-0 kubenswrapper[17644]: I0319 12:07:34.027162 17644 scope.go:117] "RemoveContainer" containerID="e906b08d672ce284b02875a6853f0e751eac277b721f1ba103b6a1fce5dcd578"
Mar 19 12:07:34.947488 master-0 kubenswrapper[17644]: E0319 12:07:34.947388 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:35.038259 master-0 kubenswrapper[17644]: I0319 12:07:35.038186 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-zs6dd" event={"ID":"daf4dbb6-5a0a-4c92-a930-479a7330ace1","Type":"ContainerStarted","Data":"1706ce04fdf03defc0b18f7e5b6e9263efef6ca7e6f82560efeee404991b3102"}
Mar 19 12:07:36.051660 master-0 kubenswrapper[17644]: I0319 12:07:36.051609 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler/0.log"
Mar 19 12:07:36.052282 master-0 kubenswrapper[17644]: I0319 12:07:36.052083 17644 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf" exitCode=1
Mar 19 12:07:36.052282 master-0 kubenswrapper[17644]: I0319 12:07:36.052215 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerDied","Data":"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf"}
Mar 19 12:07:36.053185 master-0 kubenswrapper[17644]: I0319 12:07:36.053138 17644 scope.go:117] "RemoveContainer" containerID="629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf"
Mar 19 12:07:37.062956 master-0 kubenswrapper[17644]: I0319 12:07:37.062899 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler/0.log"
Mar 19 12:07:37.063552 master-0 kubenswrapper[17644]: I0319 12:07:37.063418 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17"}
Mar 19 12:07:37.063819 master-0 kubenswrapper[17644]: I0319 12:07:37.063785 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:07:41.096313 master-0 kubenswrapper[17644]: I0319 12:07:41.096100 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-j7rc9_7a51eeaf-1349-4bf3-932d-22ed5ce7c161/control-plane-machine-set-operator/0.log"
Mar 19 12:07:41.096313 master-0 kubenswrapper[17644]: I0319 12:07:41.096178 17644 generic.go:334] "Generic (PLEG): container finished" podID="7a51eeaf-1349-4bf3-932d-22ed5ce7c161" containerID="fba15ac5fd8638fa2d8fe5188431cf574d56ecf14fb3b1611a5b61dc6246db85" exitCode=1
Mar 19 12:07:41.096313 master-0 kubenswrapper[17644]: I0319 12:07:41.096220 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9" event={"ID":"7a51eeaf-1349-4bf3-932d-22ed5ce7c161","Type":"ContainerDied","Data":"fba15ac5fd8638fa2d8fe5188431cf574d56ecf14fb3b1611a5b61dc6246db85"}
Mar 19 12:07:41.097973 master-0 kubenswrapper[17644]: I0319 12:07:41.096945 17644 scope.go:117] "RemoveContainer" containerID="fba15ac5fd8638fa2d8fe5188431cf574d56ecf14fb3b1611a5b61dc6246db85"
Mar 19 12:07:41.998795 master-0 kubenswrapper[17644]: I0319 12:07:41.998686 17644 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 12:07:41.998795 master-0 kubenswrapper[17644]: I0319 12:07:41.998780 17644 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:41.999072 master-0 kubenswrapper[17644]: I0319 12:07:41.998829 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:07:41.999414 master-0 kubenswrapper[17644]: I0319 12:07:41.999379 17644 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"9d993cd1a5833f4165f6cc8d12eb8bf154b773e372dc7ee598d180665ac9fcd8"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 19 12:07:41.999478 master-0 kubenswrapper[17644]: I0319 12:07:41.999465 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" containerID="cri-o://9d993cd1a5833f4165f6cc8d12eb8bf154b773e372dc7ee598d180665ac9fcd8" gracePeriod=30
Mar 19 12:07:42.104441 master-0 kubenswrapper[17644]: I0319 12:07:42.104361 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-j7rc9_7a51eeaf-1349-4bf3-932d-22ed5ce7c161/control-plane-machine-set-operator/0.log"
Mar 19 12:07:42.104917 master-0 kubenswrapper[17644]: I0319 12:07:42.104467 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-j7rc9"
event={"ID":"7a51eeaf-1349-4bf3-932d-22ed5ce7c161","Type":"ContainerStarted","Data":"505ac4f0f5025287865214d4b36761ae01b775bccc6370c38882c3338940b8d2"} Mar 19 12:07:43.113398 master-0 kubenswrapper[17644]: I0319 12:07:43.113357 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/cluster-policy-controller/1.log" Mar 19 12:07:43.115553 master-0 kubenswrapper[17644]: I0319 12:07:43.115424 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager/0.log" Mar 19 12:07:43.115553 master-0 kubenswrapper[17644]: I0319 12:07:43.115468 17644 generic.go:334] "Generic (PLEG): container finished" podID="607a35c2a34325129014a178207e606c" containerID="9d993cd1a5833f4165f6cc8d12eb8bf154b773e372dc7ee598d180665ac9fcd8" exitCode=255 Mar 19 12:07:43.115553 master-0 kubenswrapper[17644]: I0319 12:07:43.115501 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerDied","Data":"9d993cd1a5833f4165f6cc8d12eb8bf154b773e372dc7ee598d180665ac9fcd8"} Mar 19 12:07:43.115553 master-0 kubenswrapper[17644]: I0319 12:07:43.115537 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"607a35c2a34325129014a178207e606c","Type":"ContainerStarted","Data":"a1820a5c08e897a9a826fb2120795cc3a6c64a34860ea5d00dda9abbdf9766f3"} Mar 19 12:07:43.115553 master-0 kubenswrapper[17644]: I0319 12:07:43.115557 17644 scope.go:117] "RemoveContainer" containerID="f7c540b0641ec46c201cc061924c6ea67fd66d520e406a823de347faf358f648" Mar 19 12:07:44.122960 master-0 kubenswrapper[17644]: I0319 12:07:44.122910 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/cluster-policy-controller/1.log" Mar 19 12:07:44.125211 master-0 kubenswrapper[17644]: I0319 12:07:44.125183 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager/0.log" Mar 19 12:07:44.194500 master-0 kubenswrapper[17644]: E0319 12:07:44.194292 17644 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{downloads-66b8ffb895-vqnnc.189e3c9120d93c73 openshift-console 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:downloads-66b8ffb895-vqnnc,UID:e17d22fe-fe0f-448e-9666-882d888d3ad4,APIVersion:v1,ResourceVersion:14739,FieldPath:spec.containers{download-server},},Reason:Created,Message:Created container: download-server,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:04:57.249586291 +0000 UTC m=+331.019544346,LastTimestamp:2026-03-19 12:04:57.249586291 +0000 UTC m=+331.019544346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:07:44.948153 master-0 kubenswrapper[17644]: E0319 12:07:44.948080 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:47.494201 master-0 kubenswrapper[17644]: E0319 12:07:47.494103 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 12:07:48.998625 master-0 kubenswrapper[17644]: I0319 12:07:48.998455 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:07:48.998625 master-0 kubenswrapper[17644]: I0319 12:07:48.998553 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:07:51.851525 master-0 kubenswrapper[17644]: E0319 12:07:51.851463 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 12:07:51.999224 master-0 kubenswrapper[17644]: I0319 12:07:51.999150 17644 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:07:51.999346 master-0 kubenswrapper[17644]: I0319 12:07:51.999268 17644 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:52.206508 master-0 kubenswrapper[17644]: I0319 12:07:52.205969 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"f58f94dbf37560e9f8d10ac2adf65d4b7557e1239aa6f02c8cf7330d46305ca9"} Mar 19 12:07:53.217832 master-0 kubenswrapper[17644]: I0319 12:07:53.217665 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"7bf49bf97cafce5890ac848ad8c222728796ed708c172d4e4a4f006858cf58c6"} Mar 19 12:07:53.217832 master-0 kubenswrapper[17644]: I0319 12:07:53.217722 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"7a82d32427b81ef1136cc81cc4de8cc609942f5b1b477903e6759d5e1aef3f7a"} Mar 19 12:07:53.217832 master-0 kubenswrapper[17644]: I0319 12:07:53.217759 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"6b735fd07df1b7adf6fec5c7881c5c4048cc6a6a7f822a75f298022119fe3c9c"} Mar 19 12:07:53.217832 master-0 kubenswrapper[17644]: I0319 12:07:53.217768 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"be004d2f0d0a027fdab6e645bbd9e5be85d2305d78e8b79542f5670a553dfc29"} Mar 19 12:07:53.219107 master-0 kubenswrapper[17644]: I0319 12:07:53.218123 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c" Mar 19 12:07:53.219107 master-0 kubenswrapper[17644]: I0319 12:07:53.218154 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c" Mar 19 12:07:53.515775 master-0 kubenswrapper[17644]: I0319 12:07:53.515592 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-etcd/etcd-master-0" Mar 19 12:07:54.948864 master-0 kubenswrapper[17644]: E0319 12:07:54.948811 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:54.948864 master-0 kubenswrapper[17644]: E0319 12:07:54.948845 17644 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 12:07:57.690948 master-0 kubenswrapper[17644]: I0319 12:07:57.690848 17644 status_manager.go:851] "Failed to get status for pod" podUID="8438d015-106b-4aed-ae12-dda781ce51fc" pod="openshift-network-node-identity/network-node-identity-j528w" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods network-node-identity-j528w)" Mar 19 12:07:58.515677 master-0 kubenswrapper[17644]: I0319 12:07:58.515596 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 19 12:08:01.271967 master-0 kubenswrapper[17644]: I0319 12:08:01.271911 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/3.log" Mar 19 12:08:01.272508 master-0 kubenswrapper[17644]: I0319 12:08:01.272318 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/2.log" Mar 19 12:08:01.272508 master-0 kubenswrapper[17644]: I0319 12:08:01.272352 17644 generic.go:334] "Generic (PLEG): container finished" podID="d625c81e-01cc-424a-997d-546a5204a72b" containerID="d7edf97138edd898b8970970ba2baab9570971be057da4099c12f0fae904c734" exitCode=1 Mar 19 
12:08:01.272508 master-0 kubenswrapper[17644]: I0319 12:08:01.272379 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerDied","Data":"d7edf97138edd898b8970970ba2baab9570971be057da4099c12f0fae904c734"} Mar 19 12:08:01.272508 master-0 kubenswrapper[17644]: I0319 12:08:01.272410 17644 scope.go:117] "RemoveContainer" containerID="4330a60e4005ff3453f41f9e734958fd9bc2e8f3531b9166c2303ff1d9076a60" Mar 19 12:08:01.272953 master-0 kubenswrapper[17644]: I0319 12:08:01.272925 17644 scope.go:117] "RemoveContainer" containerID="d7edf97138edd898b8970970ba2baab9570971be057da4099c12f0fae904c734" Mar 19 12:08:01.273181 master-0 kubenswrapper[17644]: E0319 12:08:01.273142 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-764k4_openshift-cluster-storage-operator(d625c81e-01cc-424a-997d-546a5204a72b)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" podUID="d625c81e-01cc-424a-997d-546a5204a72b" Mar 19 12:08:01.999529 master-0 kubenswrapper[17644]: I0319 12:08:01.999434 17644 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:08:01.999529 master-0 kubenswrapper[17644]: I0319 12:08:01.999521 17644 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" 
probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:08:02.282522 master-0 kubenswrapper[17644]: I0319 12:08:02.282387 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/3.log" Mar 19 12:08:08.246375 master-0 kubenswrapper[17644]: I0319 12:08:08.246305 17644 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 19 12:08:08.271668 master-0 kubenswrapper[17644]: I0319 12:08:08.271520 17644 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-master-0" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56189b49-7d09-40c4-b5ce-258ce79f391c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:05:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:05:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [setup etcd-ensure-env-vars etcd-resources-copy]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:05:34Z\\\",\\\"message\\\":\\\"containers with unready status: [etcdctl etcd etcd-metrics etcd-readyz etcd-rev]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:05:34Z\\\",\\\"message\\\":\\\"containers with unready status: [etcdctl etcd etcd-metrics etcd-readyz 
etcd-rev]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}}},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Pending\\\"}}\" for pod \"openshift-etcd\"/\"etcd-master-0\": pods \"etcd-master-0\" not found" Mar 19 12:08:08.319853 master-0 kubenswrapper[17644]: I0319 12:08:08.319806 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c" Mar 19 12:08:08.319853 master-0 kubenswrapper[17644]: I0319 12:08:08.319842 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c" Mar 19 12:08:08.322233 master-0 kubenswrapper[17644]: I0319 12:08:08.322164 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 12:08:08.326301 master-0 kubenswrapper[17644]: I0319 12:08:08.326251 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 12:08:08.332814 master-0 kubenswrapper[17644]: I0319 12:08:08.332767 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 12:08:08.340323 master-0 kubenswrapper[17644]: I0319 12:08:08.340243 17644 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-66b8ffb895-vqnnc" podStartSLOduration=192.692890666 podStartE2EDuration="3m57.340224976s" podCreationTimestamp="2026-03-19 12:04:11 +0000 UTC" firstStartedPulling="2026-03-19 12:04:12.445556705 +0000 UTC m=+286.215514740" lastFinishedPulling="2026-03-19 12:04:57.092891015 +0000 UTC m=+330.862849050" observedRunningTime="2026-03-19 12:08:08.32912614 +0000 UTC m=+522.099084195" watchObservedRunningTime="2026-03-19 12:08:08.340224976 +0000 UTC m=+522.110183011" Mar 19 12:08:08.438581 master-0 kubenswrapper[17644]: I0319 12:08:08.438503 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"] Mar 19 12:08:08.447005 master-0 kubenswrapper[17644]: I0319 12:08:08.446946 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-64d6dd6b7b-xdrz5"] Mar 19 12:08:08.494029 master-0 kubenswrapper[17644]: I0319 12:08:08.493981 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8c022c-7871-4765-971f-dcafa39357c9" path="/var/lib/kubelet/pods/5f8c022c-7871-4765-971f-dcafa39357c9/volumes" Mar 19 12:08:08.539819 master-0 kubenswrapper[17644]: I0319 12:08:08.539779 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 19 12:08:08.668373 master-0 kubenswrapper[17644]: I0319 12:08:08.668246 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.668217536 podStartE2EDuration="668.217536ms" podCreationTimestamp="2026-03-19 12:08:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:08:08.660159813 +0000 UTC m=+522.430117878" watchObservedRunningTime="2026-03-19 12:08:08.668217536 +0000 UTC m=+522.438175581" Mar 19 12:08:09.006698 
master-0 kubenswrapper[17644]: I0319 12:08:09.006636 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:08:09.013792 master-0 kubenswrapper[17644]: I0319 12:08:09.013710 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:08:09.328856 master-0 kubenswrapper[17644]: I0319 12:08:09.327813 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c" Mar 19 12:08:09.328856 master-0 kubenswrapper[17644]: I0319 12:08:09.327846 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="56189b49-7d09-40c4-b5ce-258ce79f391c" Mar 19 12:08:09.349138 master-0 kubenswrapper[17644]: I0319 12:08:09.349074 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 19 12:08:15.205778 master-0 kubenswrapper[17644]: E0319 12:08:15.205382 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:08:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:08:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:08:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:08:05Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3ea089ab116e164d89b46dc077f87d9af22f525bc2d69403214f77ee3fd30161\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d9cbffb5a2fd538c8f19b7174d2906286acdb37a574b9dce3f9da302074591ff\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746416849},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:aefc421cf2f5dba925f7c149d56ce14e910fbd969a4e22b5917fc912ca33a5b2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:da1ee8c9ae2cb275833f329b3d793a9109915be16d938f208ec917b50d9dd66a\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223644894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c573
38ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\
\"],\\\"sizeBytes\\\":487159945}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:08:16.489180 master-0 kubenswrapper[17644]: I0319 12:08:16.489116 17644 scope.go:117] "RemoveContainer" containerID="d7edf97138edd898b8970970ba2baab9570971be057da4099c12f0fae904c734" Mar 19 12:08:16.489870 master-0 kubenswrapper[17644]: E0319 12:08:16.489460 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-764k4_openshift-cluster-storage-operator(d625c81e-01cc-424a-997d-546a5204a72b)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" podUID="d625c81e-01cc-424a-997d-546a5204a72b" Mar 19 12:08:18.197214 master-0 kubenswrapper[17644]: E0319 12:08:18.196977 17644 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{downloads-66b8ffb895-vqnnc.189e3c91218bc7c9 openshift-console 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:downloads-66b8ffb895-vqnnc,UID:e17d22fe-fe0f-448e-9666-882d888d3ad4,APIVersion:v1,ResourceVersion:14739,FieldPath:spec.containers{download-server},},Reason:Started,Message:Started container download-server,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:04:57.261287369 +0000 UTC m=+331.031245414,LastTimestamp:2026-03-19 12:04:57.261287369 +0000 UTC m=+331.031245414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:08:24.233139 master-0 
kubenswrapper[17644]: I0319 12:08:24.233081 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 12:08:24.235426 master-0 kubenswrapper[17644]: E0319 12:08:24.235388 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff9f91a-2293-4b2d-95dd-be0f9152984e" containerName="installer" Mar 19 12:08:24.235426 master-0 kubenswrapper[17644]: I0319 12:08:24.235422 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff9f91a-2293-4b2d-95dd-be0f9152984e" containerName="installer" Mar 19 12:08:24.235547 master-0 kubenswrapper[17644]: E0319 12:08:24.235445 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8c022c-7871-4765-971f-dcafa39357c9" containerName="metrics-server" Mar 19 12:08:24.235547 master-0 kubenswrapper[17644]: I0319 12:08:24.235454 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8c022c-7871-4765-971f-dcafa39357c9" containerName="metrics-server" Mar 19 12:08:24.235547 master-0 kubenswrapper[17644]: E0319 12:08:24.235481 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec000c4-5cc8-45b3-95ba-2856655f02f5" containerName="installer" Mar 19 12:08:24.235547 master-0 kubenswrapper[17644]: I0319 12:08:24.235492 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec000c4-5cc8-45b3-95ba-2856655f02f5" containerName="installer" Mar 19 12:08:24.235922 master-0 kubenswrapper[17644]: I0319 12:08:24.235897 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff9f91a-2293-4b2d-95dd-be0f9152984e" containerName="installer" Mar 19 12:08:24.235985 master-0 kubenswrapper[17644]: I0319 12:08:24.235954 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8c022c-7871-4765-971f-dcafa39357c9" containerName="metrics-server" Mar 19 12:08:24.236033 master-0 kubenswrapper[17644]: I0319 12:08:24.236019 17644 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ec000c4-5cc8-45b3-95ba-2856655f02f5" containerName="installer" Mar 19 12:08:24.236886 master-0 kubenswrapper[17644]: I0319 12:08:24.236859 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.244303 master-0 kubenswrapper[17644]: I0319 12:08:24.244236 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-4vwst" Mar 19 12:08:24.244527 master-0 kubenswrapper[17644]: I0319 12:08:24.244385 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 12:08:24.259007 master-0 kubenswrapper[17644]: I0319 12:08:24.258907 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-var-lock\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.259007 master-0 kubenswrapper[17644]: I0319 12:08:24.258976 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.259242 master-0 kubenswrapper[17644]: I0319 12:08:24.259050 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.259294 master-0 kubenswrapper[17644]: I0319 12:08:24.259265 
17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 12:08:24.360184 master-0 kubenswrapper[17644]: I0319 12:08:24.360106 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.360395 master-0 kubenswrapper[17644]: I0319 12:08:24.360292 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.360519 master-0 kubenswrapper[17644]: I0319 12:08:24.360443 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.360762 master-0 kubenswrapper[17644]: I0319 12:08:24.360716 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-var-lock\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.360829 master-0 kubenswrapper[17644]: I0319 12:08:24.360806 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-var-lock\") pod \"installer-4-master-0\" (UID: 
\"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:24.832871 master-0 kubenswrapper[17644]: I0319 12:08:24.832777 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-6-retry-1-master-0"] Mar 19 12:08:24.834079 master-0 kubenswrapper[17644]: I0319 12:08:24.833977 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:24.842095 master-0 kubenswrapper[17644]: I0319 12:08:24.841476 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 12:08:24.842095 master-0 kubenswrapper[17644]: I0319 12:08:24.841732 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-kpv7f" Mar 19 12:08:24.853516 master-0 kubenswrapper[17644]: I0319 12:08:24.853446 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-retry-1-master-0"] Mar 19 12:08:24.968460 master-0 kubenswrapper[17644]: I0319 12:08:24.968387 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:24.968460 master-0 kubenswrapper[17644]: I0319 12:08:24.968446 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8128555b-e19b-4259-acb7-b54f350850d0-kube-api-access\") pod \"installer-6-retry-1-master-0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:24.968768 master-0 
kubenswrapper[17644]: I0319 12:08:24.968501 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:25.070205 master-0 kubenswrapper[17644]: I0319 12:08:25.070103 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:25.070205 master-0 kubenswrapper[17644]: I0319 12:08:25.070193 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:25.070205 master-0 kubenswrapper[17644]: I0319 12:08:25.070230 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8128555b-e19b-4259-acb7-b54f350850d0-kube-api-access\") pod \"installer-6-retry-1-master-0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:25.070675 master-0 kubenswrapper[17644]: I0319 12:08:25.070359 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " 
pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:25.070675 master-0 kubenswrapper[17644]: I0319 12:08:25.070442 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:25.206533 master-0 kubenswrapper[17644]: E0319 12:08:25.206442 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:08:26.432632 master-0 kubenswrapper[17644]: I0319 12:08:26.432553 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 12:08:26.433709 master-0 kubenswrapper[17644]: I0319 12:08:26.433688 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.436414 master-0 kubenswrapper[17644]: I0319 12:08:26.436375 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 12:08:26.437350 master-0 kubenswrapper[17644]: I0319 12:08:26.437301 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-g6g26" Mar 19 12:08:26.467801 master-0 kubenswrapper[17644]: I0319 12:08:26.467348 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 12:08:26.595242 master-0 kubenswrapper[17644]: I0319 12:08:26.595184 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.596256 master-0 kubenswrapper[17644]: I0319 12:08:26.596188 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-var-lock\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.596341 master-0 kubenswrapper[17644]: I0319 12:08:26.596259 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.698141 master-0 
kubenswrapper[17644]: I0319 12:08:26.698078 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-var-lock\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.698141 master-0 kubenswrapper[17644]: I0319 12:08:26.698144 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.698516 master-0 kubenswrapper[17644]: I0319 12:08:26.698446 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.698679 master-0 kubenswrapper[17644]: I0319 12:08:26.698642 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.698938 master-0 kubenswrapper[17644]: I0319 12:08:26.698907 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-var-lock\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:08:26.870119 
master-0 kubenswrapper[17644]: I0319 12:08:26.869911 17644 scope.go:117] "RemoveContainer" containerID="fa33151970d752ef2161babaa56491652362bb6f1d5e173d5390c7f59b36f27d" Mar 19 12:08:28.497754 master-0 kubenswrapper[17644]: I0319 12:08:28.497681 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:08:31.484394 master-0 kubenswrapper[17644]: I0319 12:08:31.484346 17644 scope.go:117] "RemoveContainer" containerID="d7edf97138edd898b8970970ba2baab9570971be057da4099c12f0fae904c734" Mar 19 12:08:32.526249 master-0 kubenswrapper[17644]: I0319 12:08:32.526175 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/3.log" Mar 19 12:08:32.526249 master-0 kubenswrapper[17644]: I0319 12:08:32.526249 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerStarted","Data":"929340400aba9fe4e6bdc1b44a88bcc955ba57855b3d7ae839aa6c55c2194e53"} Mar 19 12:08:34.229059 master-0 kubenswrapper[17644]: I0319 12:08:34.228970 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-6-retry-1-master-0"] Mar 19 12:08:34.229635 master-0 kubenswrapper[17644]: E0319 12:08:34.229554 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-kube-apiserver/installer-6-retry-1-master-0" podUID="8128555b-e19b-4259-acb7-b54f350850d0" Mar 19 12:08:34.542006 master-0 kubenswrapper[17644]: I0319 12:08:34.541862 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:34.555341 master-0 kubenswrapper[17644]: I0319 12:08:34.555298 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:34.647769 master-0 kubenswrapper[17644]: I0319 12:08:34.647679 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-var-lock\") pod \"8128555b-e19b-4259-acb7-b54f350850d0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " Mar 19 12:08:34.648197 master-0 kubenswrapper[17644]: I0319 12:08:34.647818 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-kubelet-dir\") pod \"8128555b-e19b-4259-acb7-b54f350850d0\" (UID: \"8128555b-e19b-4259-acb7-b54f350850d0\") " Mar 19 12:08:34.648197 master-0 kubenswrapper[17644]: I0319 12:08:34.647871 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8128555b-e19b-4259-acb7-b54f350850d0" (UID: "8128555b-e19b-4259-acb7-b54f350850d0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:08:34.648197 master-0 kubenswrapper[17644]: I0319 12:08:34.647888 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-var-lock" (OuterVolumeSpecName: "var-lock") pod "8128555b-e19b-4259-acb7-b54f350850d0" (UID: "8128555b-e19b-4259-acb7-b54f350850d0"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:08:34.648858 master-0 kubenswrapper[17644]: I0319 12:08:34.648821 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:34.648858 master-0 kubenswrapper[17644]: I0319 12:08:34.648852 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8128555b-e19b-4259-acb7-b54f350850d0-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:35.207259 master-0 kubenswrapper[17644]: E0319 12:08:35.207145 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:08:35.549488 master-0 kubenswrapper[17644]: I0319 12:08:35.549363 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-retry-1-master-0" Mar 19 12:08:35.615351 master-0 kubenswrapper[17644]: I0319 12:08:35.615262 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-6-retry-1-master-0"] Mar 19 12:08:35.624283 master-0 kubenswrapper[17644]: I0319 12:08:35.624237 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-6-retry-1-master-0"] Mar 19 12:08:36.491740 master-0 kubenswrapper[17644]: I0319 12:08:36.491654 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8128555b-e19b-4259-acb7-b54f350850d0" path="/var/lib/kubelet/pods/8128555b-e19b-4259-acb7-b54f350850d0/volumes" Mar 19 12:08:36.824689 master-0 kubenswrapper[17644]: I0319 12:08:36.824558 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"] Mar 19 12:08:36.833819 master-0 kubenswrapper[17644]: I0319 12:08:36.826047 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:36.833819 master-0 kubenswrapper[17644]: I0319 12:08:36.829930 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 12:08:36.833819 master-0 kubenswrapper[17644]: I0319 12:08:36.830211 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-kpv7f" Mar 19 12:08:36.833819 master-0 kubenswrapper[17644]: I0319 12:08:36.832588 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"] Mar 19 12:08:36.883788 master-0 kubenswrapper[17644]: I0319 12:08:36.883691 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:36.884037 master-0 kubenswrapper[17644]: I0319 12:08:36.883955 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-var-lock\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:36.884373 master-0 kubenswrapper[17644]: I0319 12:08:36.884305 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:36.987094 master-0 kubenswrapper[17644]: I0319 12:08:36.987005 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:36.987428 master-0 kubenswrapper[17644]: I0319 12:08:36.987410 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-var-lock\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:36.987624 master-0 kubenswrapper[17644]: I0319 12:08:36.987558 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-var-lock\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:36.987624 master-0 kubenswrapper[17644]: I0319 12:08:36.987592 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:36.987840 master-0 kubenswrapper[17644]: I0319 12:08:36.987819 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:08:45.207658 master-0 kubenswrapper[17644]: E0319 12:08:45.207595 17644 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:08:52.201497 master-0 kubenswrapper[17644]: E0319 12:08:52.201213 17644 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Mar 19 12:08:52.201497 master-0 kubenswrapper[17644]: &Event{ObjectMeta:{downloads-66b8ffb895-vqnnc.189e3c913a9a2bc4 openshift-console 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:downloads-66b8ffb895-vqnnc,UID:e17d22fe-fe0f-448e-9666-882d888d3ad4,APIVersion:v1,ResourceVersion:14739,FieldPath:spec.containers{download-server},},Reason:ProbeError,Message:Readiness probe error: Get "http://10.128.0.95:8080/": dial tcp 10.128.0.95:8080: connect: connection refused Mar 19 12:08:52.201497 master-0 kubenswrapper[17644]: body: Mar 19 12:08:52.201497 master-0 kubenswrapper[17644]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:04:57.681660868 +0000 UTC m=+331.451618913,LastTimestamp:2026-03-19 12:04:57.681660868 +0000 UTC m=+331.451618913,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 19 12:08:52.201497 master-0 kubenswrapper[17644]: > Mar 19 12:08:55.208069 master-0 kubenswrapper[17644]: E0319 12:08:55.207977 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:08:55.208069 master-0 kubenswrapper[17644]: E0319 12:08:55.208031 17644 kubelet_node_status.go:572] "Unable 
to update node status" err="update node status exceeds retry count" Mar 19 12:08:58.364765 master-0 kubenswrapper[17644]: I0319 12:08:58.364682 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 12:08:58.373066 master-0 kubenswrapper[17644]: E0319 12:08:58.372967 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-scheduler/installer-4-master-0: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Mar 19 12:08:58.373066 master-0 kubenswrapper[17644]: E0319 12:08:58.373059 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access podName:e40539b3-c74d-45b8-8526-d25a3a41c336 nodeName:}" failed. No retries permitted until 2026-03-19 12:08:58.873036913 +0000 UTC m=+572.642994948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access") pod "installer-4-master-0" (UID: "e40539b3-c74d-45b8-8526-d25a3a41c336") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Mar 19 12:08:58.879945 master-0 kubenswrapper[17644]: I0319 12:08:58.879891 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:08:59.073410 master-0 kubenswrapper[17644]: E0319 12:08:59.073346 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-6-retry-1-master-0: failed to fetch token: Timeout: request did not 
complete within requested timeout - context deadline exceeded Mar 19 12:08:59.073672 master-0 kubenswrapper[17644]: E0319 12:08:59.073438 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8128555b-e19b-4259-acb7-b54f350850d0-kube-api-access podName:8128555b-e19b-4259-acb7-b54f350850d0 nodeName:}" failed. No retries permitted until 2026-03-19 12:08:59.573416066 +0000 UTC m=+573.343374121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/8128555b-e19b-4259-acb7-b54f350850d0-kube-api-access") pod "installer-6-retry-1-master-0" (UID: "8128555b-e19b-4259-acb7-b54f350850d0") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Mar 19 12:08:59.083587 master-0 kubenswrapper[17644]: I0319 12:08:59.083500 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8128555b-e19b-4259-acb7-b54f350850d0-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:00.700819 master-0 kubenswrapper[17644]: E0319 12:09:00.700672 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-3-master-0: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Mar 19 12:09:00.701390 master-0 kubenswrapper[17644]: E0319 12:09:00.700859 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kube-api-access podName:3a918e1b-bc1e-4947-b33b-bb346a4221c2 nodeName:}" failed. No retries permitted until 2026-03-19 12:09:01.200814252 +0000 UTC m=+574.970772327 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kube-api-access") pod "installer-3-master-0" (UID: "3a918e1b-bc1e-4947-b33b-bb346a4221c2") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Mar 19 12:09:01.219022 master-0 kubenswrapper[17644]: I0319 12:09:01.218961 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:09:01.743210 master-0 kubenswrapper[17644]: I0319 12:09:01.743165 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-b485796d4-dqrfs_97f5b7e8-eee9-42b1-a23e-8a74f1ce4585/controller-manager/1.log" Mar 19 12:09:01.743720 master-0 kubenswrapper[17644]: I0319 12:09:01.743591 17644 generic.go:334] "Generic (PLEG): container finished" podID="97f5b7e8-eee9-42b1-a23e-8a74f1ce4585" containerID="89e2fb0436aaf5420411b2928148d341d7b6c5d6dab580ec01bca9d61919925b" exitCode=255 Mar 19 12:09:01.743720 master-0 kubenswrapper[17644]: I0319 12:09:01.743649 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" event={"ID":"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585","Type":"ContainerDied","Data":"89e2fb0436aaf5420411b2928148d341d7b6c5d6dab580ec01bca9d61919925b"} Mar 19 12:09:01.743720 master-0 kubenswrapper[17644]: I0319 12:09:01.743681 17644 scope.go:117] "RemoveContainer" containerID="6ed56431e7e3a29594e8c55d24af97e05dc53fc52776fe94fedb9d579e864bcd" Mar 19 12:09:01.744824 master-0 kubenswrapper[17644]: I0319 12:09:01.744622 17644 scope.go:117] "RemoveContainer" 
containerID="89e2fb0436aaf5420411b2928148d341d7b6c5d6dab580ec01bca9d61919925b" Mar 19 12:09:01.745129 master-0 kubenswrapper[17644]: E0319 12:09:01.745084 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=controller-manager pod=controller-manager-b485796d4-dqrfs_openshift-controller-manager(97f5b7e8-eee9-42b1-a23e-8a74f1ce4585)\"" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" podUID="97f5b7e8-eee9-42b1-a23e-8a74f1ce4585" Mar 19 12:09:01.745550 master-0 kubenswrapper[17644]: I0319 12:09:01.745515 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/4.log" Mar 19 12:09:01.746203 master-0 kubenswrapper[17644]: I0319 12:09:01.746175 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/3.log" Mar 19 12:09:01.746273 master-0 kubenswrapper[17644]: I0319 12:09:01.746226 17644 generic.go:334] "Generic (PLEG): container finished" podID="d625c81e-01cc-424a-997d-546a5204a72b" containerID="929340400aba9fe4e6bdc1b44a88bcc955ba57855b3d7ae839aa6c55c2194e53" exitCode=1 Mar 19 12:09:01.746273 master-0 kubenswrapper[17644]: I0319 12:09:01.746256 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerDied","Data":"929340400aba9fe4e6bdc1b44a88bcc955ba57855b3d7ae839aa6c55c2194e53"} Mar 19 12:09:01.746754 master-0 kubenswrapper[17644]: I0319 12:09:01.746711 17644 scope.go:117] "RemoveContainer" containerID="929340400aba9fe4e6bdc1b44a88bcc955ba57855b3d7ae839aa6c55c2194e53" Mar 19 12:09:01.746937 
master-0 kubenswrapper[17644]: E0319 12:09:01.746916 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-764k4_openshift-cluster-storage-operator(d625c81e-01cc-424a-997d-546a5204a72b)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" podUID="d625c81e-01cc-424a-997d-546a5204a72b" Mar 19 12:09:01.780770 master-0 kubenswrapper[17644]: I0319 12:09:01.779423 17644 scope.go:117] "RemoveContainer" containerID="d7edf97138edd898b8970970ba2baab9570971be057da4099c12f0fae904c734" Mar 19 12:09:02.753899 master-0 kubenswrapper[17644]: I0319 12:09:02.753845 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/4.log" Mar 19 12:09:02.755367 master-0 kubenswrapper[17644]: I0319 12:09:02.755347 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-b485796d4-dqrfs_97f5b7e8-eee9-42b1-a23e-8a74f1ce4585/controller-manager/1.log" Mar 19 12:09:07.295097 master-0 kubenswrapper[17644]: I0319 12:09:07.294865 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 12:09:07.296284 master-0 kubenswrapper[17644]: E0319 12:09:07.295861 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-kube-controller-manager/installer-3-master-0" podUID="3a918e1b-bc1e-4947-b33b-bb346a4221c2" Mar 19 12:09:07.480861 master-0 kubenswrapper[17644]: I0319 12:09:07.480800 17644 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:09:07.481286 master-0 kubenswrapper[17644]: I0319 12:09:07.481269 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:09:07.483263 master-0 kubenswrapper[17644]: I0319 12:09:07.483213 17644 scope.go:117] "RemoveContainer" containerID="89e2fb0436aaf5420411b2928148d341d7b6c5d6dab580ec01bca9d61919925b" Mar 19 12:09:07.483548 master-0 kubenswrapper[17644]: E0319 12:09:07.483500 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=controller-manager pod=controller-manager-b485796d4-dqrfs_openshift-controller-manager(97f5b7e8-eee9-42b1-a23e-8a74f1ce4585)\"" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" podUID="97f5b7e8-eee9-42b1-a23e-8a74f1ce4585" Mar 19 12:09:07.788006 master-0 kubenswrapper[17644]: I0319 12:09:07.787890 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:09:07.789323 master-0 kubenswrapper[17644]: I0319 12:09:07.789264 17644 scope.go:117] "RemoveContainer" containerID="89e2fb0436aaf5420411b2928148d341d7b6c5d6dab580ec01bca9d61919925b" Mar 19 12:09:07.789779 master-0 kubenswrapper[17644]: E0319 12:09:07.789673 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=controller-manager pod=controller-manager-b485796d4-dqrfs_openshift-controller-manager(97f5b7e8-eee9-42b1-a23e-8a74f1ce4585)\"" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" podUID="97f5b7e8-eee9-42b1-a23e-8a74f1ce4585" Mar 19 12:09:07.802248 master-0 kubenswrapper[17644]: I0319 12:09:07.802200 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:09:07.935498 master-0 kubenswrapper[17644]: I0319 12:09:07.935418 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-var-lock\") pod \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " Mar 19 12:09:07.935757 master-0 kubenswrapper[17644]: I0319 12:09:07.935535 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kubelet-dir\") pod \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\" (UID: \"3a918e1b-bc1e-4947-b33b-bb346a4221c2\") " Mar 19 12:09:07.935757 master-0 kubenswrapper[17644]: I0319 12:09:07.935580 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-var-lock" (OuterVolumeSpecName: "var-lock") pod 
"3a918e1b-bc1e-4947-b33b-bb346a4221c2" (UID: "3a918e1b-bc1e-4947-b33b-bb346a4221c2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:07.935888 master-0 kubenswrapper[17644]: I0319 12:09:07.935811 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3a918e1b-bc1e-4947-b33b-bb346a4221c2" (UID: "3a918e1b-bc1e-4947-b33b-bb346a4221c2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:07.936799 master-0 kubenswrapper[17644]: I0319 12:09:07.936752 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:07.936862 master-0 kubenswrapper[17644]: I0319 12:09:07.936808 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:08.302029 master-0 kubenswrapper[17644]: I0319 12:09:08.301962 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 12:09:08.302958 master-0 kubenswrapper[17644]: I0319 12:09:08.302936 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.310106 master-0 kubenswrapper[17644]: I0319 12:09:08.310060 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 12:09:08.444601 master-0 kubenswrapper[17644]: I0319 12:09:08.444553 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-var-lock\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.444916 master-0 kubenswrapper[17644]: I0319 12:09:08.444895 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.445104 master-0 kubenswrapper[17644]: I0319 12:09:08.445083 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d6a091-6854-4e44-8e7c-b2089cae286c-kube-api-access\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.546415 master-0 kubenswrapper[17644]: I0319 12:09:08.546352 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-var-lock\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.546415 master-0 
kubenswrapper[17644]: I0319 12:09:08.546419 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.546740 master-0 kubenswrapper[17644]: I0319 12:09:08.546477 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d6a091-6854-4e44-8e7c-b2089cae286c-kube-api-access\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.546740 master-0 kubenswrapper[17644]: I0319 12:09:08.546491 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-var-lock\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.546740 master-0 kubenswrapper[17644]: I0319 12:09:08.546581 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:08.794334 master-0 kubenswrapper[17644]: I0319 12:09:08.794236 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 12:09:08.831860 master-0 kubenswrapper[17644]: I0319 12:09:08.831735 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 12:09:08.836577 master-0 kubenswrapper[17644]: I0319 12:09:08.836531 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 12:09:10.493471 master-0 kubenswrapper[17644]: I0319 12:09:10.493407 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a918e1b-bc1e-4947-b33b-bb346a4221c2" path="/var/lib/kubelet/pods/3a918e1b-bc1e-4947-b33b-bb346a4221c2/volumes" Mar 19 12:09:10.990623 master-0 kubenswrapper[17644]: E0319 12:09:10.990552 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-7-master-0: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Mar 19 12:09:10.990897 master-0 kubenswrapper[17644]: E0319 12:09:10.990668 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access podName:7def3099-f487-44d4-a1d5-2ae096ef8804 nodeName:}" failed. No retries permitted until 2026-03-19 12:09:11.490641937 +0000 UTC m=+585.260600182 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access") pod "installer-7-master-0" (UID: "7def3099-f487-44d4-a1d5-2ae096ef8804") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Mar 19 12:09:11.498667 master-0 kubenswrapper[17644]: I0319 12:09:11.498582 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:09:14.839073 master-0 kubenswrapper[17644]: I0319 12:09:14.839001 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/1.log" Mar 19 12:09:14.840380 master-0 kubenswrapper[17644]: I0319 12:09:14.840347 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/0.log" Mar 19 12:09:14.840469 master-0 kubenswrapper[17644]: I0319 12:09:14.840417 17644 generic.go:334] "Generic (PLEG): container finished" podID="92e401a4-ed2f-46f7-924b-329d7b313e6a" containerID="32d00ec9f79f3b36c30f20da42acbe4672f3d595a12272450bf9df8eb150b8ba" exitCode=1 Mar 19 12:09:14.840522 master-0 kubenswrapper[17644]: I0319 12:09:14.840467 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" event={"ID":"92e401a4-ed2f-46f7-924b-329d7b313e6a","Type":"ContainerDied","Data":"32d00ec9f79f3b36c30f20da42acbe4672f3d595a12272450bf9df8eb150b8ba"} Mar 19 12:09:14.840586 master-0 kubenswrapper[17644]: I0319 12:09:14.840527 
17644 scope.go:117] "RemoveContainer" containerID="46876a7e063d974c121cff378937380f72a9002e08dc430717d4d702ce311e44" Mar 19 12:09:14.841253 master-0 kubenswrapper[17644]: I0319 12:09:14.841202 17644 scope.go:117] "RemoveContainer" containerID="32d00ec9f79f3b36c30f20da42acbe4672f3d595a12272450bf9df8eb150b8ba" Mar 19 12:09:14.841592 master-0 kubenswrapper[17644]: E0319 12:09:14.841483 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-942g6_openshift-machine-api(92e401a4-ed2f-46f7-924b-329d7b313e6a)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" podUID="92e401a4-ed2f-46f7-924b-329d7b313e6a" Mar 19 12:09:15.478353 master-0 kubenswrapper[17644]: E0319 12:09:15.478195 17644 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:09:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:09:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:09:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:09:05Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3ea089ab116e164d89b46dc077f87d9af22f525bc2d69403214f77ee3fd30161\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d9cbffb5a2fd538c8f19b7174d2906286acdb37a574b9dce3f9da302074591ff\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746416849},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:aefc421cf2f5dba925f7c149d56ce14e910fbd969a4e22b5917fc912ca33a5b2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:da1ee8c9ae2cb275833f329b3d793a9109915be16d938f208ec917b50d9dd66a\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223644894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c573
38ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\
\"],\\\"sizeBytes\\\":487159945}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:09:15.850137 master-0 kubenswrapper[17644]: I0319 12:09:15.850003 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/1.log" Mar 19 12:09:16.489221 master-0 kubenswrapper[17644]: I0319 12:09:16.489179 17644 scope.go:117] "RemoveContainer" containerID="929340400aba9fe4e6bdc1b44a88bcc955ba57855b3d7ae839aa6c55c2194e53" Mar 19 12:09:16.489876 master-0 kubenswrapper[17644]: E0319 12:09:16.489852 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-764k4_openshift-cluster-storage-operator(d625c81e-01cc-424a-997d-546a5204a72b)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" podUID="d625c81e-01cc-424a-997d-546a5204a72b" Mar 19 12:09:17.943095 master-0 kubenswrapper[17644]: E0319 12:09:17.939556 17644 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-3-master-0: failed to fetch token: pods "installer-3-master-0" not found Mar 19 12:09:17.943095 master-0 kubenswrapper[17644]: E0319 12:09:17.939654 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kube-api-access podName:3a918e1b-bc1e-4947-b33b-bb346a4221c2 nodeName:}" failed. No retries permitted until 2026-03-19 12:09:18.939634476 +0000 UTC m=+592.709592511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kube-api-access") pod "installer-3-master-0" (UID: "3a918e1b-bc1e-4947-b33b-bb346a4221c2") : failed to fetch token: pods "installer-3-master-0" not found Mar 19 12:09:17.952745 master-0 kubenswrapper[17644]: I0319 12:09:17.952618 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:09:17.955816 master-0 kubenswrapper[17644]: I0319 12:09:17.955765 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d6a091-6854-4e44-8e7c-b2089cae286c-kube-api-access\") pod \"installer-4-master-0\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:17.975213 master-0 kubenswrapper[17644]: I0319 12:09:17.975154 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:09:17.976471 master-0 kubenswrapper[17644]: I0319 12:09:17.976431 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-4vwst" Mar 19 12:09:17.985880 master-0 kubenswrapper[17644]: I0319 12:09:17.985826 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:09:18.031740 master-0 kubenswrapper[17644]: I0319 12:09:18.031672 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a918e1b-bc1e-4947-b33b-bb346a4221c2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:18.265864 master-0 kubenswrapper[17644]: I0319 12:09:18.264300 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:09:18.265864 master-0 kubenswrapper[17644]: I0319 12:09:18.265105 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:18.683260 master-0 kubenswrapper[17644]: I0319 12:09:18.683145 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 12:09:18.709232 master-0 kubenswrapper[17644]: W0319 12:09:18.708186 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode40539b3_c74d_45b8_8526_d25a3a41c336.slice/crio-8e673fdc2a73d34bceac27f1f86506af398f6d45a54b2a2728987ee2b3c14168 WatchSource:0}: Error finding container 8e673fdc2a73d34bceac27f1f86506af398f6d45a54b2a2728987ee2b3c14168: Status 404 returned error can't find the container with id 8e673fdc2a73d34bceac27f1f86506af398f6d45a54b2a2728987ee2b3c14168 Mar 19 12:09:18.745959 master-0 kubenswrapper[17644]: I0319 12:09:18.744318 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"] Mar 19 12:09:18.760797 master-0 kubenswrapper[17644]: I0319 12:09:18.759262 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"] Mar 19 12:09:18.765234 master-0 kubenswrapper[17644]: I0319 12:09:18.765169 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-d9758b5c6-n2b98"] Mar 19 12:09:18.837834 master-0 kubenswrapper[17644]: I0319 12:09:18.836586 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 12:09:18.923447 master-0 kubenswrapper[17644]: I0319 12:09:18.923340 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"7def3099-f487-44d4-a1d5-2ae096ef8804","Type":"ContainerStarted","Data":"a175aaff4983d76d4df244070ffa7754983086ec35c53d253a59e2fc3c4007f9"} Mar 19 12:09:18.925958 master-0 kubenswrapper[17644]: I0319 12:09:18.925889 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"47d6a091-6854-4e44-8e7c-b2089cae286c","Type":"ContainerStarted","Data":"c69e0702250c60cdbd0576be35b123739c887d356444f6df4e557577b9a12928"} Mar 19 12:09:18.927399 master-0 kubenswrapper[17644]: I0319 12:09:18.927174 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"e40539b3-c74d-45b8-8526-d25a3a41c336","Type":"ContainerStarted","Data":"8e673fdc2a73d34bceac27f1f86506af398f6d45a54b2a2728987ee2b3c14168"} Mar 19 12:09:19.937313 master-0 kubenswrapper[17644]: I0319 12:09:19.936871 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"7def3099-f487-44d4-a1d5-2ae096ef8804","Type":"ContainerStarted","Data":"494afb441050ae61d98bfcdbb49df2010d70e4c70da80aefeef6526c1b9b02d2"} Mar 19 12:09:19.944465 master-0 kubenswrapper[17644]: I0319 12:09:19.940921 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"47d6a091-6854-4e44-8e7c-b2089cae286c","Type":"ContainerStarted","Data":"10280c512d50c98469a1460825a080495daf0b68a956dd42c1acbb90ff4776d5"} Mar 19 12:09:19.946475 master-0 kubenswrapper[17644]: 
I0319 12:09:19.946420 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"e40539b3-c74d-45b8-8526-d25a3a41c336","Type":"ContainerStarted","Data":"2adfba7d45404824eb5262853f4842c457955e93a85077eac87789dcfd1a1821"} Mar 19 12:09:19.961769 master-0 kubenswrapper[17644]: I0319 12:09:19.961662 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-7-master-0" podStartSLOduration=43.961639458 podStartE2EDuration="43.961639458s" podCreationTimestamp="2026-03-19 12:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:09:19.955140273 +0000 UTC m=+593.725098328" watchObservedRunningTime="2026-03-19 12:09:19.961639458 +0000 UTC m=+593.731597483" Mar 19 12:09:19.981580 master-0 kubenswrapper[17644]: I0319 12:09:19.981441 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=11.981397562 podStartE2EDuration="11.981397562s" podCreationTimestamp="2026-03-19 12:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:09:19.977485428 +0000 UTC m=+593.747443483" watchObservedRunningTime="2026-03-19 12:09:19.981397562 +0000 UTC m=+593.751355607" Mar 19 12:09:20.006527 master-0 kubenswrapper[17644]: I0319 12:09:20.006398 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=56.00636941 podStartE2EDuration="56.00636941s" podCreationTimestamp="2026-03-19 12:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:09:19.999389983 +0000 UTC m=+593.769348048" 
watchObservedRunningTime="2026-03-19 12:09:20.00636941 +0000 UTC m=+593.776327455" Mar 19 12:09:20.483930 master-0 kubenswrapper[17644]: I0319 12:09:20.483889 17644 scope.go:117] "RemoveContainer" containerID="89e2fb0436aaf5420411b2928148d341d7b6c5d6dab580ec01bca9d61919925b" Mar 19 12:09:20.956791 master-0 kubenswrapper[17644]: I0319 12:09:20.956714 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-b485796d4-dqrfs_97f5b7e8-eee9-42b1-a23e-8a74f1ce4585/controller-manager/1.log" Mar 19 12:09:20.957293 master-0 kubenswrapper[17644]: I0319 12:09:20.956892 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" event={"ID":"97f5b7e8-eee9-42b1-a23e-8a74f1ce4585","Type":"ContainerStarted","Data":"9d6e6604ada5e78dd38c4fad68ecca6a07dcca232f6fa18b87dc43c0c148aded"} Mar 19 12:09:26.204878 master-0 kubenswrapper[17644]: E0319 12:09:26.204633 17644 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{downloads-66b8ffb895-vqnnc.189e3c913a9ae911 openshift-console 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:downloads-66b8ffb895-vqnnc,UID:e17d22fe-fe0f-448e-9666-882d888d3ad4,APIVersion:v1,ResourceVersion:14739,FieldPath:spec.containers{download-server},},Reason:Unhealthy,Message:Readiness probe failed: Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:04:57.681709329 +0000 UTC m=+331.451667394,LastTimestamp:2026-03-19 12:04:57.681709329 +0000 UTC m=+331.451667394,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:09:27.481069 
master-0 kubenswrapper[17644]: I0319 12:09:27.480766 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:09:27.485884 master-0 kubenswrapper[17644]: I0319 12:09:27.484549 17644 scope.go:117] "RemoveContainer" containerID="929340400aba9fe4e6bdc1b44a88bcc955ba57855b3d7ae839aa6c55c2194e53" Mar 19 12:09:27.485884 master-0 kubenswrapper[17644]: I0319 12:09:27.484695 17644 scope.go:117] "RemoveContainer" containerID="32d00ec9f79f3b36c30f20da42acbe4672f3d595a12272450bf9df8eb150b8ba" Mar 19 12:09:27.485884 master-0 kubenswrapper[17644]: E0319 12:09:27.484837 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-764k4_openshift-cluster-storage-operator(d625c81e-01cc-424a-997d-546a5204a72b)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" podUID="d625c81e-01cc-424a-997d-546a5204a72b" Mar 19 12:09:27.485884 master-0 kubenswrapper[17644]: I0319 12:09:27.485790 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b485796d4-dqrfs" Mar 19 12:09:28.014750 master-0 kubenswrapper[17644]: I0319 12:09:28.014661 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/1.log" Mar 19 12:09:28.015457 master-0 kubenswrapper[17644]: I0319 12:09:28.015375 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-942g6" event={"ID":"92e401a4-ed2f-46f7-924b-329d7b313e6a","Type":"ContainerStarted","Data":"a248996b027a9d0fa8e933bb019265e014e67fb8b42e9a157e86b026d5a07fd2"} Mar 19 12:09:41.483647 master-0 
kubenswrapper[17644]: I0319 12:09:41.483590 17644 scope.go:117] "RemoveContainer" containerID="929340400aba9fe4e6bdc1b44a88bcc955ba57855b3d7ae839aa6c55c2194e53" Mar 19 12:09:41.484321 master-0 kubenswrapper[17644]: E0319 12:09:41.483877 17644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-764k4_openshift-cluster-storage-operator(d625c81e-01cc-424a-997d-546a5204a72b)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" podUID="d625c81e-01cc-424a-997d-546a5204a72b" Mar 19 12:09:43.778120 master-0 kubenswrapper[17644]: I0319 12:09:43.778032 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" podUID="3e43eccd-712d-459a-92af-5c2e900409c0" containerName="oauth-openshift" containerID="cri-o://04f45dfb302524bd6eb9768c32c0f5f01aa67d750a9b740fc57bde47ed7c9d45" gracePeriod=15 Mar 19 12:09:43.814999 master-0 kubenswrapper[17644]: I0319 12:09:43.814952 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-d9758b5c6-n2b98" podUID="f636deab-3372-4c36-b492-df2442da1e31" containerName="console" containerID="cri-o://240d62d6290746118c3fa599222f62e950c62cff164d71a917712346ad5fd3da" gracePeriod=15 Mar 19 12:09:44.128137 master-0 kubenswrapper[17644]: I0319 12:09:44.128009 17644 generic.go:334] "Generic (PLEG): container finished" podID="3e43eccd-712d-459a-92af-5c2e900409c0" containerID="04f45dfb302524bd6eb9768c32c0f5f01aa67d750a9b740fc57bde47ed7c9d45" exitCode=0 Mar 19 12:09:44.128137 master-0 kubenswrapper[17644]: I0319 12:09:44.128102 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" 
event={"ID":"3e43eccd-712d-459a-92af-5c2e900409c0","Type":"ContainerDied","Data":"04f45dfb302524bd6eb9768c32c0f5f01aa67d750a9b740fc57bde47ed7c9d45"} Mar 19 12:09:44.130174 master-0 kubenswrapper[17644]: I0319 12:09:44.130133 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d9758b5c6-n2b98_f636deab-3372-4c36-b492-df2442da1e31/console/0.log" Mar 19 12:09:44.130240 master-0 kubenswrapper[17644]: I0319 12:09:44.130197 17644 generic.go:334] "Generic (PLEG): container finished" podID="f636deab-3372-4c36-b492-df2442da1e31" containerID="240d62d6290746118c3fa599222f62e950c62cff164d71a917712346ad5fd3da" exitCode=2 Mar 19 12:09:44.130276 master-0 kubenswrapper[17644]: I0319 12:09:44.130234 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9758b5c6-n2b98" event={"ID":"f636deab-3372-4c36-b492-df2442da1e31","Type":"ContainerDied","Data":"240d62d6290746118c3fa599222f62e950c62cff164d71a917712346ad5fd3da"} Mar 19 12:09:44.488627 master-0 kubenswrapper[17644]: I0319 12:09:44.488538 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" Mar 19 12:09:44.557050 master-0 kubenswrapper[17644]: I0319 12:09:44.557012 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d9758b5c6-n2b98_f636deab-3372-4c36-b492-df2442da1e31/console/0.log" Mar 19 12:09:44.557265 master-0 kubenswrapper[17644]: I0319 12:09:44.557083 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d9758b5c6-n2b98" Mar 19 12:09:44.630567 master-0 kubenswrapper[17644]: I0319 12:09:44.630508 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-service-ca\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.630809 master-0 kubenswrapper[17644]: I0319 12:09:44.630612 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqp8z\" (UniqueName: \"kubernetes.io/projected/3e43eccd-712d-459a-92af-5c2e900409c0-kube-api-access-vqp8z\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631028 master-0 kubenswrapper[17644]: I0319 12:09:44.630663 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-trusted-ca-bundle\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631089 master-0 kubenswrapper[17644]: I0319 12:09:44.631050 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-router-certs\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631089 master-0 kubenswrapper[17644]: I0319 12:09:44.631074 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-login\") pod 
\"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631172 master-0 kubenswrapper[17644]: I0319 12:09:44.631127 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-error\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631172 master-0 kubenswrapper[17644]: I0319 12:09:44.631156 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e43eccd-712d-459a-92af-5c2e900409c0-audit-dir\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631248 master-0 kubenswrapper[17644]: I0319 12:09:44.631179 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-ocp-branding-template\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631248 master-0 kubenswrapper[17644]: I0319 12:09:44.631218 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-serving-cert\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631248 master-0 kubenswrapper[17644]: I0319 12:09:44.631237 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-cliconfig\") pod 
\"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631333 master-0 kubenswrapper[17644]: I0319 12:09:44.631259 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-session\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631364 master-0 kubenswrapper[17644]: I0319 12:09:44.631339 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-provider-selection\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631364 master-0 kubenswrapper[17644]: I0319 12:09:44.631326 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:44.631427 master-0 kubenswrapper[17644]: I0319 12:09:44.631395 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-audit-policies\") pod \"3e43eccd-712d-459a-92af-5c2e900409c0\" (UID: \"3e43eccd-712d-459a-92af-5c2e900409c0\") " Mar 19 12:09:44.631673 master-0 kubenswrapper[17644]: I0319 12:09:44.631560 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e43eccd-712d-459a-92af-5c2e900409c0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:44.631988 master-0 kubenswrapper[17644]: I0319 12:09:44.631935 17644 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e43eccd-712d-459a-92af-5c2e900409c0-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:44.631988 master-0 kubenswrapper[17644]: I0319 12:09:44.631965 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:44.632147 master-0 kubenswrapper[17644]: I0319 12:09:44.632020 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:44.632427 master-0 kubenswrapper[17644]: I0319 12:09:44.632399 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:44.632698 master-0 kubenswrapper[17644]: I0319 12:09:44.632670 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:44.634548 master-0 kubenswrapper[17644]: I0319 12:09:44.634513 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:44.634877 master-0 kubenswrapper[17644]: I0319 12:09:44.634849 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:44.634982 master-0 kubenswrapper[17644]: I0319 12:09:44.634940 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:44.635289 master-0 kubenswrapper[17644]: I0319 12:09:44.635224 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:44.635343 master-0 kubenswrapper[17644]: I0319 12:09:44.635317 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:44.635642 master-0 kubenswrapper[17644]: I0319 12:09:44.635602 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:44.635702 master-0 kubenswrapper[17644]: I0319 12:09:44.635648 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e43eccd-712d-459a-92af-5c2e900409c0-kube-api-access-vqp8z" (OuterVolumeSpecName: "kube-api-access-vqp8z") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "kube-api-access-vqp8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:09:44.636136 master-0 kubenswrapper[17644]: I0319 12:09:44.636100 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3e43eccd-712d-459a-92af-5c2e900409c0" (UID: "3e43eccd-712d-459a-92af-5c2e900409c0"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:44.734756 master-0 kubenswrapper[17644]: I0319 12:09:44.734527 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-console-config\") pod \"f636deab-3372-4c36-b492-df2442da1e31\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") "
Mar 19 12:09:44.734756 master-0 kubenswrapper[17644]: I0319 12:09:44.734587 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-oauth-config\") pod \"f636deab-3372-4c36-b492-df2442da1e31\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") "
Mar 19 12:09:44.735158 master-0 kubenswrapper[17644]: I0319 12:09:44.734818 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmlbh\" (UniqueName: \"kubernetes.io/projected/f636deab-3372-4c36-b492-df2442da1e31-kube-api-access-cmlbh\") pod \"f636deab-3372-4c36-b492-df2442da1e31\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") "
Mar 19 12:09:44.735158 master-0 kubenswrapper[17644]: I0319 12:09:44.734989 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-serving-cert\") pod \"f636deab-3372-4c36-b492-df2442da1e31\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") "
Mar 19 12:09:44.735158 master-0 kubenswrapper[17644]: I0319 12:09:44.735119 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-service-ca\") pod \"f636deab-3372-4c36-b492-df2442da1e31\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") "
Mar 19 12:09:44.735316 master-0 kubenswrapper[17644]: I0319 12:09:44.735183 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-oauth-serving-cert\") pod \"f636deab-3372-4c36-b492-df2442da1e31\" (UID: \"f636deab-3372-4c36-b492-df2442da1e31\") "
Mar 19 12:09:44.735381 master-0 kubenswrapper[17644]: I0319 12:09:44.735295 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-console-config" (OuterVolumeSpecName: "console-config") pod "f636deab-3372-4c36-b492-df2442da1e31" (UID: "f636deab-3372-4c36-b492-df2442da1e31"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:09:44.735887 master-0 kubenswrapper[17644]: I0319 12:09:44.735802 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.735887 master-0 kubenswrapper[17644]: I0319 12:09:44.735834 17644 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-audit-policies\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.735887 master-0 kubenswrapper[17644]: I0319 12:09:44.735849 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.735887 master-0 kubenswrapper[17644]: I0319 12:09:44.735862 17644 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-console-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.735887 master-0 kubenswrapper[17644]: I0319 12:09:44.735884 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqp8z\" (UniqueName: \"kubernetes.io/projected/3e43eccd-712d-459a-92af-5c2e900409c0-kube-api-access-vqp8z\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.736146 master-0 kubenswrapper[17644]: I0319 12:09:44.735897 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.736146 master-0 kubenswrapper[17644]: I0319 12:09:44.735911 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.736146 master-0 kubenswrapper[17644]: I0319 12:09:44.735923 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.736146 master-0 kubenswrapper[17644]: I0319 12:09:44.735938 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.736146 master-0 kubenswrapper[17644]: I0319 12:09:44.735951 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.736146 master-0 kubenswrapper[17644]: I0319 12:09:44.735964 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.736146 master-0 kubenswrapper[17644]: I0319 12:09:44.735977 17644 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e43eccd-712d-459a-92af-5c2e900409c0-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.736413 master-0 kubenswrapper[17644]: I0319 12:09:44.736248 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f636deab-3372-4c36-b492-df2442da1e31" (UID: "f636deab-3372-4c36-b492-df2442da1e31"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:09:44.736871 master-0 kubenswrapper[17644]: I0319 12:09:44.736603 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-service-ca" (OuterVolumeSpecName: "service-ca") pod "f636deab-3372-4c36-b492-df2442da1e31" (UID: "f636deab-3372-4c36-b492-df2442da1e31"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:09:44.738923 master-0 kubenswrapper[17644]: I0319 12:09:44.738881 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f636deab-3372-4c36-b492-df2442da1e31-kube-api-access-cmlbh" (OuterVolumeSpecName: "kube-api-access-cmlbh") pod "f636deab-3372-4c36-b492-df2442da1e31" (UID: "f636deab-3372-4c36-b492-df2442da1e31"). InnerVolumeSpecName "kube-api-access-cmlbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:09:44.739346 master-0 kubenswrapper[17644]: I0319 12:09:44.739304 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f636deab-3372-4c36-b492-df2442da1e31" (UID: "f636deab-3372-4c36-b492-df2442da1e31"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:44.739489 master-0 kubenswrapper[17644]: I0319 12:09:44.739460 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f636deab-3372-4c36-b492-df2442da1e31" (UID: "f636deab-3372-4c36-b492-df2442da1e31"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:44.838748 master-0 kubenswrapper[17644]: I0319 12:09:44.838616 17644 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.838748 master-0 kubenswrapper[17644]: I0319 12:09:44.838688 17644 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.838748 master-0 kubenswrapper[17644]: I0319 12:09:44.838698 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmlbh\" (UniqueName: \"kubernetes.io/projected/f636deab-3372-4c36-b492-df2442da1e31-kube-api-access-cmlbh\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.838748 master-0 kubenswrapper[17644]: I0319 12:09:44.838707 17644 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f636deab-3372-4c36-b492-df2442da1e31-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:44.838748 master-0 kubenswrapper[17644]: I0319 12:09:44.838718 17644 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f636deab-3372-4c36-b492-df2442da1e31-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:45.139275 master-0 kubenswrapper[17644]: I0319 12:09:45.139169 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d9758b5c6-n2b98_f636deab-3372-4c36-b492-df2442da1e31/console/0.log"
Mar 19 12:09:45.139466 master-0 kubenswrapper[17644]: I0319 12:09:45.139324 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9758b5c6-n2b98" event={"ID":"f636deab-3372-4c36-b492-df2442da1e31","Type":"ContainerDied","Data":"8659f435a458173cfde862d58e5a1c333c45e91207e97205273a5b46661fbad0"}
Mar 19 12:09:45.139466 master-0 kubenswrapper[17644]: I0319 12:09:45.139370 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d9758b5c6-n2b98"
Mar 19 12:09:45.139466 master-0 kubenswrapper[17644]: I0319 12:09:45.139378 17644 scope.go:117] "RemoveContainer" containerID="240d62d6290746118c3fa599222f62e950c62cff164d71a917712346ad5fd3da"
Mar 19 12:09:45.143163 master-0 kubenswrapper[17644]: I0319 12:09:45.143132 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8" event={"ID":"3e43eccd-712d-459a-92af-5c2e900409c0","Type":"ContainerDied","Data":"1081c0cd1af59e4de9012cc7cf2706812e9d825c27f618f134714732b6230dda"}
Mar 19 12:09:45.143277 master-0 kubenswrapper[17644]: I0319 12:09:45.143238 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"
Mar 19 12:09:45.156753 master-0 kubenswrapper[17644]: I0319 12:09:45.156569 17644 scope.go:117] "RemoveContainer" containerID="04f45dfb302524bd6eb9768c32c0f5f01aa67d750a9b740fc57bde47ed7c9d45"
Mar 19 12:09:45.194853 master-0 kubenswrapper[17644]: I0319 12:09:45.193440 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d9758b5c6-n2b98"]
Mar 19 12:09:45.200467 master-0 kubenswrapper[17644]: I0319 12:09:45.200401 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d9758b5c6-n2b98"]
Mar 19 12:09:45.204912 master-0 kubenswrapper[17644]: I0319 12:09:45.204870 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"]
Mar 19 12:09:45.209232 master-0 kubenswrapper[17644]: I0319 12:09:45.209161 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-847fb46dcc-qwvn8"]
Mar 19 12:09:46.490989 master-0 kubenswrapper[17644]: I0319 12:09:46.490946 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e43eccd-712d-459a-92af-5c2e900409c0" path="/var/lib/kubelet/pods/3e43eccd-712d-459a-92af-5c2e900409c0/volumes"
Mar 19 12:09:46.491709 master-0 kubenswrapper[17644]: I0319 12:09:46.491683 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f636deab-3372-4c36-b492-df2442da1e31" path="/var/lib/kubelet/pods/f636deab-3372-4c36-b492-df2442da1e31/volumes"
Mar 19 12:09:47.014049 master-0 kubenswrapper[17644]: I0319 12:09:47.013979 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 19 12:09:47.014395 master-0 kubenswrapper[17644]: I0319 12:09:47.014352 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="alertmanager" containerID="cri-o://2543d1abea93d0c704c4c0db59eb4a042ce21c8d13bf468370e1bf83e2ab8472" gracePeriod=120
Mar 19 12:09:47.014500 master-0 kubenswrapper[17644]: I0319 12:09:47.014447 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy-metric" containerID="cri-o://7f2b5ea288b54601e650c6ff62459c9ee6250acdb744f9a5c0b44b5469782a4a" gracePeriod=120
Mar 19 12:09:47.014568 master-0 kubenswrapper[17644]: I0319 12:09:47.014511 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="prom-label-proxy" containerID="cri-o://a8631f6d1fc66da871d0154b3f1477651cfe5d30c25d45308d569545d8f58367" gracePeriod=120
Mar 19 12:09:47.014653 master-0 kubenswrapper[17644]: I0319 12:09:47.014571 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy-web" containerID="cri-o://880632dd345de6706363ebcb7ea267f4f31bac46efa67ff9fad79f362a13e5ae" gracePeriod=120
Mar 19 12:09:47.014694 master-0 kubenswrapper[17644]: I0319 12:09:47.014536 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="config-reloader" containerID="cri-o://eb175aac5718f4058eadd585a688cb34e8c72deb365d141c863bb48284d27075" gracePeriod=120
Mar 19 12:09:47.014762 master-0 kubenswrapper[17644]: I0319 12:09:47.014618 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy" containerID="cri-o://3363d72a93f53effdf1582ca8bebe3f83cc8a44bae0001ada896373248911862" gracePeriod=120
Mar 19 12:09:47.162620 master-0 kubenswrapper[17644]: I0319 12:09:47.162572 17644 generic.go:334] "Generic (PLEG): container finished" podID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerID="a8631f6d1fc66da871d0154b3f1477651cfe5d30c25d45308d569545d8f58367" exitCode=0
Mar 19 12:09:47.162620 master-0 kubenswrapper[17644]: I0319 12:09:47.162607 17644 generic.go:334] "Generic (PLEG): container finished" podID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerID="3363d72a93f53effdf1582ca8bebe3f83cc8a44bae0001ada896373248911862" exitCode=0
Mar 19 12:09:47.162620 master-0 kubenswrapper[17644]: I0319 12:09:47.162615 17644 generic.go:334] "Generic (PLEG): container finished" podID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerID="eb175aac5718f4058eadd585a688cb34e8c72deb365d141c863bb48284d27075" exitCode=0
Mar 19 12:09:47.162620 master-0 kubenswrapper[17644]: I0319 12:09:47.162626 17644 generic.go:334] "Generic (PLEG): container finished" podID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerID="2543d1abea93d0c704c4c0db59eb4a042ce21c8d13bf468370e1bf83e2ab8472" exitCode=0
Mar 19 12:09:47.163014 master-0 kubenswrapper[17644]: I0319 12:09:47.162660 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerDied","Data":"a8631f6d1fc66da871d0154b3f1477651cfe5d30c25d45308d569545d8f58367"}
Mar 19 12:09:47.163014 master-0 kubenswrapper[17644]: I0319 12:09:47.162716 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerDied","Data":"3363d72a93f53effdf1582ca8bebe3f83cc8a44bae0001ada896373248911862"}
Mar 19 12:09:47.163014 master-0 kubenswrapper[17644]: I0319 12:09:47.162775 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerDied","Data":"eb175aac5718f4058eadd585a688cb34e8c72deb365d141c863bb48284d27075"}
Mar 19 12:09:47.163014 master-0 kubenswrapper[17644]: I0319 12:09:47.162796 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerDied","Data":"2543d1abea93d0c704c4c0db59eb4a042ce21c8d13bf468370e1bf83e2ab8472"}
Mar 19 12:09:48.173750 master-0 kubenswrapper[17644]: I0319 12:09:48.172080 17644 generic.go:334] "Generic (PLEG): container finished" podID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerID="7f2b5ea288b54601e650c6ff62459c9ee6250acdb744f9a5c0b44b5469782a4a" exitCode=0
Mar 19 12:09:48.173750 master-0 kubenswrapper[17644]: I0319 12:09:48.172111 17644 generic.go:334] "Generic (PLEG): container finished" podID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerID="880632dd345de6706363ebcb7ea267f4f31bac46efa67ff9fad79f362a13e5ae" exitCode=0
Mar 19 12:09:48.173750 master-0 kubenswrapper[17644]: I0319 12:09:48.172130 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerDied","Data":"7f2b5ea288b54601e650c6ff62459c9ee6250acdb744f9a5c0b44b5469782a4a"}
Mar 19 12:09:48.173750 master-0 kubenswrapper[17644]: I0319 12:09:48.172155 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerDied","Data":"880632dd345de6706363ebcb7ea267f4f31bac46efa67ff9fad79f362a13e5ae"}
Mar 19 12:09:48.555874 master-0 kubenswrapper[17644]: I0319 12:09:48.555801 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 19 12:09:48.692497 master-0 kubenswrapper[17644]: I0319 12:09:48.692425 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4bm\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-kube-api-access-2d4bm\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.692497 master-0 kubenswrapper[17644]: I0319 12:09:48.692488 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.692810 master-0 kubenswrapper[17644]: I0319 12:09:48.692551 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-out\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.692810 master-0 kubenswrapper[17644]: I0319 12:09:48.692576 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-web-config\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.693123 master-0 kubenswrapper[17644]: I0319 12:09:48.693092 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-tls-assets\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.693185 master-0 kubenswrapper[17644]: I0319 12:09:48.693174 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-main-tls\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.693700 master-0 kubenswrapper[17644]: I0319 12:09:48.693674 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-metrics-client-ca\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.694310 master-0 kubenswrapper[17644]: I0319 12:09:48.694276 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:09:48.694398 master-0 kubenswrapper[17644]: I0319 12:09:48.694375 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.694444 master-0 kubenswrapper[17644]: I0319 12:09:48.694423 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.694476 master-0 kubenswrapper[17644]: I0319 12:09:48.694415 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:09:48.694821 master-0 kubenswrapper[17644]: I0319 12:09:48.694796 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-main-db\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.694885 master-0 kubenswrapper[17644]: I0319 12:09:48.694872 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-web\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.694951 master-0 kubenswrapper[17644]: I0319 12:09:48.694910 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-volume\") pod \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\" (UID: \"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8\") "
Mar 19 12:09:48.695661 master-0 kubenswrapper[17644]: I0319 12:09:48.695632 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:48.695825 master-0 kubenswrapper[17644]: I0319 12:09:48.695804 17644 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.695895 master-0 kubenswrapper[17644]: I0319 12:09:48.695827 17644 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-metrics-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.696884 master-0 kubenswrapper[17644]: I0319 12:09:48.696810 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:09:48.696969 master-0 kubenswrapper[17644]: I0319 12:09:48.696911 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-kube-api-access-2d4bm" (OuterVolumeSpecName: "kube-api-access-2d4bm") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "kube-api-access-2d4bm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:09:48.704757 master-0 kubenswrapper[17644]: I0319 12:09:48.697108 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:48.704757 master-0 kubenswrapper[17644]: I0319 12:09:48.697171 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-out" (OuterVolumeSpecName: "config-out") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:09:48.704757 master-0 kubenswrapper[17644]: I0319 12:09:48.695841 17644 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.704757 master-0 kubenswrapper[17644]: I0319 12:09:48.697690 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:09:48.704757 master-0 kubenswrapper[17644]: I0319 12:09:48.698164 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:48.704757 master-0 kubenswrapper[17644]: I0319 12:09:48.698179 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:48.707919 master-0 kubenswrapper[17644]: I0319 12:09:48.707323 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:48.738809 master-0 kubenswrapper[17644]: I0319 12:09:48.738748 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-web-config" (OuterVolumeSpecName: "web-config") pod "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" (UID: "8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797832 17644 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797868 17644 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-volume\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797880 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4bm\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-kube-api-access-2d4bm\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797889 17644 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797899 17644 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-config-out\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797908 17644 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-web-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797915 17644 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-tls-assets\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797923 17644 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:48.797938 master-0 kubenswrapper[17644]: I0319 12:09:48.797934 17644 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8-alertmanager-main-db\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:49.181927 master-0 kubenswrapper[17644]: I0319 12:09:49.181879 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8","Type":"ContainerDied","Data":"e3653f9b87f745028de768fe39f1b91bd3018e50e0ce6566f3984acf7c35f394"}
Mar 19 12:09:49.182456 master-0 kubenswrapper[17644]: I0319 12:09:49.181941 17644 scope.go:117] "RemoveContainer" containerID="a8631f6d1fc66da871d0154b3f1477651cfe5d30c25d45308d569545d8f58367"
Mar 19 12:09:49.182456 master-0 kubenswrapper[17644]: I0319 12:09:49.181988 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 19 12:09:49.194882 master-0 kubenswrapper[17644]: I0319 12:09:49.194832 17644 scope.go:117] "RemoveContainer" containerID="7f2b5ea288b54601e650c6ff62459c9ee6250acdb744f9a5c0b44b5469782a4a"
Mar 19 12:09:49.210090 master-0 kubenswrapper[17644]: I0319 12:09:49.210044 17644 scope.go:117] "RemoveContainer" containerID="3363d72a93f53effdf1582ca8bebe3f83cc8a44bae0001ada896373248911862"
Mar 19 12:09:49.224111 master-0 kubenswrapper[17644]: I0319 12:09:49.223947 17644 scope.go:117] "RemoveContainer" containerID="880632dd345de6706363ebcb7ea267f4f31bac46efa67ff9fad79f362a13e5ae"
Mar 19 12:09:49.231911 master-0 kubenswrapper[17644]: I0319 12:09:49.231853 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 19 12:09:49.237881 master-0 kubenswrapper[17644]: I0319 12:09:49.237833 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 19 12:09:49.243385 master-0 kubenswrapper[17644]: I0319 12:09:49.243361 17644 scope.go:117] "RemoveContainer" containerID="eb175aac5718f4058eadd585a688cb34e8c72deb365d141c863bb48284d27075"
Mar 19 12:09:49.260317 master-0 kubenswrapper[17644]: I0319 12:09:49.260283 17644 scope.go:117] "RemoveContainer" containerID="2543d1abea93d0c704c4c0db59eb4a042ce21c8d13bf468370e1bf83e2ab8472"
Mar 19 12:09:49.275521 master-0 kubenswrapper[17644]: I0319 12:09:49.275489 17644 scope.go:117] "RemoveContainer" containerID="b00de993f0e0b8655ba99ede10174e49c870ebb69aeba71e72944bf22b99febf"
Mar 19 12:09:50.357873 master-0 kubenswrapper[17644]: I0319 12:09:50.357721 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 12:09:50.358417 master-0 kubenswrapper[17644]: I0319 12:09:50.358122 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler-cert-syncer" containerID="cri-o://6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32" gracePeriod=30
Mar 19 12:09:50.358417 master-0 kubenswrapper[17644]: I0319 12:09:50.358216 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler" containerID="cri-o://b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17" gracePeriod=30
Mar 19 12:09:50.358490 master-0 kubenswrapper[17644]: I0319 12:09:50.358364 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler-recovery-controller" containerID="cri-o://614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464" gracePeriod=30
Mar 19 12:09:50.358814 master-0 kubenswrapper[17644]: I0319 12:09:50.358762 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 12:09:50.359260 master-0 kubenswrapper[17644]: E0319 12:09:50.359217 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler"
Mar 19 12:09:50.359260 master-0 kubenswrapper[17644]: I0319 12:09:50.359252 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler"
Mar 19 12:09:50.359327 master-0 kubenswrapper[17644]: E0319 12:09:50.359272 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy-web"
Mar 19 12:09:50.359327 master-0 kubenswrapper[17644]: I0319 12:09:50.359282 17644 state_mem.go:107] "Deleted
CPUSet assignment" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy-web" Mar 19 12:09:50.359327 master-0 kubenswrapper[17644]: E0319 12:09:50.359315 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy-metric" Mar 19 12:09:50.359327 master-0 kubenswrapper[17644]: I0319 12:09:50.359323 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy-metric" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: E0319 12:09:50.359339 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="config-reloader" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: I0319 12:09:50.359348 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="config-reloader" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: E0319 12:09:50.359370 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e43eccd-712d-459a-92af-5c2e900409c0" containerName="oauth-openshift" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: I0319 12:09:50.359377 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e43eccd-712d-459a-92af-5c2e900409c0" containerName="oauth-openshift" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: E0319 12:09:50.359393 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="init-config-reloader" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: I0319 12:09:50.359401 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="init-config-reloader" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: E0319 12:09:50.359423 17644 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="wait-for-host-port" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: I0319 12:09:50.359432 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="wait-for-host-port" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: E0319 12:09:50.359444 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler-cert-syncer" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: I0319 12:09:50.359451 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler-cert-syncer" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: E0319 12:09:50.359462 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="prom-label-proxy" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: I0319 12:09:50.359469 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="prom-label-proxy" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: E0319 12:09:50.359477 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f636deab-3372-4c36-b492-df2442da1e31" containerName="console" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: I0319 12:09:50.359485 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="f636deab-3372-4c36-b492-df2442da1e31" containerName="console" Mar 19 12:09:50.359480 master-0 kubenswrapper[17644]: E0319 12:09:50.359498 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="alertmanager" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359508 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="alertmanager" Mar 19 12:09:50.359925 
master-0 kubenswrapper[17644]: E0319 12:09:50.359523 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler-recovery-controller" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359532 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler-recovery-controller" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: E0319 12:09:50.359546 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359553 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359721 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy-web" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359791 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="prom-label-proxy" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359804 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359823 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="f636deab-3372-4c36-b492-df2442da1e31" containerName="console" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359842 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="config-reloader" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 
12:09:50.359858 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy-metric" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359872 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler-cert-syncer" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359883 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler-recovery-controller" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359896 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e43eccd-712d-459a-92af-5c2e900409c0" containerName="oauth-openshift" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359907 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="kube-rbac-proxy" Mar 19 12:09:50.359925 master-0 kubenswrapper[17644]: I0319 12:09:50.359917 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" containerName="alertmanager" Mar 19 12:09:50.361565 master-0 kubenswrapper[17644]: E0319 12:09:50.360078 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler" Mar 19 12:09:50.361565 master-0 kubenswrapper[17644]: I0319 12:09:50.360116 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler" Mar 19 12:09:50.361565 master-0 kubenswrapper[17644]: I0319 12:09:50.361100 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler" Mar 19 12:09:50.493106 master-0 kubenswrapper[17644]: I0319 12:09:50.492325 17644 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8" path="/var/lib/kubelet/pods/8f4cfb4b-ef6e-40d5-a1c1-fbebbece2dc8/volumes" Mar 19 12:09:50.519875 master-0 kubenswrapper[17644]: I0319 12:09:50.519584 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:09:50.519875 master-0 kubenswrapper[17644]: I0319 12:09:50.519676 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:09:50.551999 master-0 kubenswrapper[17644]: I0319 12:09:50.551936 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler-cert-syncer/0.log" Mar 19 12:09:50.552705 master-0 kubenswrapper[17644]: I0319 12:09:50.552662 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler/0.log" Mar 19 12:09:50.553309 master-0 kubenswrapper[17644]: I0319 12:09:50.553268 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:09:50.557402 master-0 kubenswrapper[17644]: I0319 12:09:50.557354 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8dd3d3608fe9c86b0f65904ec2353df4" podUID="8413125cf444e5c95f023c5dd9c6151e" Mar 19 12:09:50.621123 master-0 kubenswrapper[17644]: I0319 12:09:50.620988 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:09:50.621123 master-0 kubenswrapper[17644]: I0319 12:09:50.621099 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:09:50.621332 master-0 kubenswrapper[17644]: I0319 12:09:50.621127 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:09:50.621332 master-0 kubenswrapper[17644]: I0319 12:09:50.621150 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:09:50.722949 master-0 kubenswrapper[17644]: I0319 12:09:50.722816 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"8dd3d3608fe9c86b0f65904ec2353df4\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " Mar 19 12:09:50.723256 master-0 kubenswrapper[17644]: I0319 12:09:50.723010 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8dd3d3608fe9c86b0f65904ec2353df4" (UID: "8dd3d3608fe9c86b0f65904ec2353df4"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:50.723256 master-0 kubenswrapper[17644]: I0319 12:09:50.723132 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"8dd3d3608fe9c86b0f65904ec2353df4\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " Mar 19 12:09:50.723355 master-0 kubenswrapper[17644]: I0319 12:09:50.723141 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8dd3d3608fe9c86b0f65904ec2353df4" (UID: "8dd3d3608fe9c86b0f65904ec2353df4"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:50.724005 master-0 kubenswrapper[17644]: I0319 12:09:50.723961 17644 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:50.724077 master-0 kubenswrapper[17644]: I0319 12:09:50.724008 17644 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.198986 master-0 kubenswrapper[17644]: I0319 12:09:51.198932 17644 generic.go:334] "Generic (PLEG): container finished" podID="e40539b3-c74d-45b8-8526-d25a3a41c336" containerID="2adfba7d45404824eb5262853f4842c457955e93a85077eac87789dcfd1a1821" exitCode=0 Mar 19 12:09:51.199328 master-0 kubenswrapper[17644]: I0319 12:09:51.199006 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"e40539b3-c74d-45b8-8526-d25a3a41c336","Type":"ContainerDied","Data":"2adfba7d45404824eb5262853f4842c457955e93a85077eac87789dcfd1a1821"} Mar 19 12:09:51.202049 master-0 kubenswrapper[17644]: I0319 12:09:51.202008 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler-cert-syncer/0.log" Mar 19 12:09:51.202574 master-0 kubenswrapper[17644]: I0319 12:09:51.202535 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler/0.log" Mar 19 12:09:51.203017 master-0 kubenswrapper[17644]: I0319 12:09:51.202969 17644 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17" exitCode=0 
Mar 19 12:09:51.203017 master-0 kubenswrapper[17644]: I0319 12:09:51.203006 17644 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464" exitCode=0 Mar 19 12:09:51.203138 master-0 kubenswrapper[17644]: I0319 12:09:51.203026 17644 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32" exitCode=2 Mar 19 12:09:51.203138 master-0 kubenswrapper[17644]: I0319 12:09:51.203052 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:09:51.203138 master-0 kubenswrapper[17644]: I0319 12:09:51.203070 17644 scope.go:117] "RemoveContainer" containerID="b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17" Mar 19 12:09:51.218465 master-0 kubenswrapper[17644]: I0319 12:09:51.218433 17644 scope.go:117] "RemoveContainer" containerID="614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464" Mar 19 12:09:51.232588 master-0 kubenswrapper[17644]: I0319 12:09:51.232505 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8dd3d3608fe9c86b0f65904ec2353df4" podUID="8413125cf444e5c95f023c5dd9c6151e" Mar 19 12:09:51.237135 master-0 kubenswrapper[17644]: I0319 12:09:51.237084 17644 scope.go:117] "RemoveContainer" containerID="6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32" Mar 19 12:09:51.258054 master-0 kubenswrapper[17644]: I0319 12:09:51.257992 17644 scope.go:117] "RemoveContainer" containerID="629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf" Mar 19 12:09:51.271993 master-0 kubenswrapper[17644]: I0319 12:09:51.271958 17644 scope.go:117] "RemoveContainer" 
containerID="acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c" Mar 19 12:09:51.293896 master-0 kubenswrapper[17644]: I0319 12:09:51.293844 17644 scope.go:117] "RemoveContainer" containerID="b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17" Mar 19 12:09:51.295220 master-0 kubenswrapper[17644]: E0319 12:09:51.295167 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17\": container with ID starting with b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17 not found: ID does not exist" containerID="b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17" Mar 19 12:09:51.295293 master-0 kubenswrapper[17644]: I0319 12:09:51.295236 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17"} err="failed to get container status \"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17\": rpc error: code = NotFound desc = could not find container \"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17\": container with ID starting with b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17 not found: ID does not exist" Mar 19 12:09:51.295293 master-0 kubenswrapper[17644]: I0319 12:09:51.295283 17644 scope.go:117] "RemoveContainer" containerID="614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464" Mar 19 12:09:51.298153 master-0 kubenswrapper[17644]: E0319 12:09:51.298093 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464\": container with ID starting with 614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464 not found: ID does not exist" 
containerID="614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464" Mar 19 12:09:51.298248 master-0 kubenswrapper[17644]: I0319 12:09:51.298164 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464"} err="failed to get container status \"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464\": rpc error: code = NotFound desc = could not find container \"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464\": container with ID starting with 614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464 not found: ID does not exist" Mar 19 12:09:51.298248 master-0 kubenswrapper[17644]: I0319 12:09:51.298205 17644 scope.go:117] "RemoveContainer" containerID="6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32" Mar 19 12:09:51.298576 master-0 kubenswrapper[17644]: E0319 12:09:51.298541 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32\": container with ID starting with 6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32 not found: ID does not exist" containerID="6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32" Mar 19 12:09:51.298636 master-0 kubenswrapper[17644]: I0319 12:09:51.298583 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32"} err="failed to get container status \"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32\": rpc error: code = NotFound desc = could not find container \"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32\": container with ID starting with 6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32 not found: ID does not exist" Mar 19 12:09:51.298636 master-0 
kubenswrapper[17644]: I0319 12:09:51.298606 17644 scope.go:117] "RemoveContainer" containerID="629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf" Mar 19 12:09:51.298920 master-0 kubenswrapper[17644]: E0319 12:09:51.298888 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf\": container with ID starting with 629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf not found: ID does not exist" containerID="629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf" Mar 19 12:09:51.298984 master-0 kubenswrapper[17644]: I0319 12:09:51.298923 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf"} err="failed to get container status \"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf\": rpc error: code = NotFound desc = could not find container \"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf\": container with ID starting with 629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf not found: ID does not exist" Mar 19 12:09:51.298984 master-0 kubenswrapper[17644]: I0319 12:09:51.298944 17644 scope.go:117] "RemoveContainer" containerID="acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c" Mar 19 12:09:51.299429 master-0 kubenswrapper[17644]: E0319 12:09:51.299400 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c\": container with ID starting with acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c not found: ID does not exist" containerID="acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c" Mar 19 12:09:51.299488 master-0 kubenswrapper[17644]: I0319 12:09:51.299425 
17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c"} err="failed to get container status \"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c\": rpc error: code = NotFound desc = could not find container \"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c\": container with ID starting with acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c not found: ID does not exist" Mar 19 12:09:51.299488 master-0 kubenswrapper[17644]: I0319 12:09:51.299440 17644 scope.go:117] "RemoveContainer" containerID="b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17" Mar 19 12:09:51.299941 master-0 kubenswrapper[17644]: I0319 12:09:51.299905 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17"} err="failed to get container status \"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17\": rpc error: code = NotFound desc = could not find container \"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17\": container with ID starting with b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17 not found: ID does not exist" Mar 19 12:09:51.299941 master-0 kubenswrapper[17644]: I0319 12:09:51.299936 17644 scope.go:117] "RemoveContainer" containerID="614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464" Mar 19 12:09:51.300259 master-0 kubenswrapper[17644]: I0319 12:09:51.300230 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464"} err="failed to get container status \"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464\": rpc error: code = NotFound desc = could not find container 
\"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464\": container with ID starting with 614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464 not found: ID does not exist" Mar 19 12:09:51.300259 master-0 kubenswrapper[17644]: I0319 12:09:51.300253 17644 scope.go:117] "RemoveContainer" containerID="6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32" Mar 19 12:09:51.300690 master-0 kubenswrapper[17644]: I0319 12:09:51.300642 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32"} err="failed to get container status \"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32\": rpc error: code = NotFound desc = could not find container \"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32\": container with ID starting with 6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32 not found: ID does not exist" Mar 19 12:09:51.300690 master-0 kubenswrapper[17644]: I0319 12:09:51.300686 17644 scope.go:117] "RemoveContainer" containerID="629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.301049 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf"} err="failed to get container status \"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf\": rpc error: code = NotFound desc = could not find container \"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf\": container with ID starting with 629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf not found: ID does not exist" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.301076 17644 scope.go:117] "RemoveContainer" containerID="acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c" Mar 19 
12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.301363 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c"} err="failed to get container status \"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c\": rpc error: code = NotFound desc = could not find container \"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c\": container with ID starting with acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c not found: ID does not exist" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.301415 17644 scope.go:117] "RemoveContainer" containerID="b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.301756 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17"} err="failed to get container status \"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17\": rpc error: code = NotFound desc = could not find container \"b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17\": container with ID starting with b3485db6aaddee4e4d797b661b5dd9b9c4e87879acbe3a148927365e2fb64f17 not found: ID does not exist" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.301774 17644 scope.go:117] "RemoveContainer" containerID="614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.302044 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464"} err="failed to get container status \"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464\": rpc error: code = NotFound desc = could not find 
container \"614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464\": container with ID starting with 614b9a040714905a39e760a6cecb6220bb3a230fd76a3172673a8f3b177db464 not found: ID does not exist" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.302073 17644 scope.go:117] "RemoveContainer" containerID="6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.302362 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32"} err="failed to get container status \"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32\": rpc error: code = NotFound desc = could not find container \"6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32\": container with ID starting with 6d72821ebade3dd7591d917fa0467249fc3edcd5dc0972864a90c11eda293f32 not found: ID does not exist" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.302382 17644 scope.go:117] "RemoveContainer" containerID="629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.302624 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf"} err="failed to get container status \"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf\": rpc error: code = NotFound desc = could not find container \"629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf\": container with ID starting with 629d5b962178c5cc8cd88274e6fb264cf46c233c7dcebb5d39e1e6499d1fc8bf not found: ID does not exist" Mar 19 12:09:51.302888 master-0 kubenswrapper[17644]: I0319 12:09:51.302644 17644 scope.go:117] "RemoveContainer" containerID="acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c" 
Mar 19 12:09:51.303332 master-0 kubenswrapper[17644]: I0319 12:09:51.303070 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c"} err="failed to get container status \"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c\": rpc error: code = NotFound desc = could not find container \"acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c\": container with ID starting with acfd1ce3103a5fa37df0519179075aa6fc37e5626660c372ef45915da61d7d3c not found: ID does not exist" Mar 19 12:09:51.383937 master-0 kubenswrapper[17644]: I0319 12:09:51.383879 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:09:51.384440 master-0 kubenswrapper[17644]: I0319 12:09:51.384239 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="prometheus" containerID="cri-o://39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd" gracePeriod=600 Mar 19 12:09:51.384440 master-0 kubenswrapper[17644]: I0319 12:09:51.384280 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19" gracePeriod=600 Mar 19 12:09:51.384440 master-0 kubenswrapper[17644]: I0319 12:09:51.384361 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy-web" containerID="cri-o://d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698" gracePeriod=600 Mar 19 12:09:51.384440 master-0 kubenswrapper[17644]: I0319 12:09:51.384361 
17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy" containerID="cri-o://f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0" gracePeriod=600 Mar 19 12:09:51.384575 master-0 kubenswrapper[17644]: I0319 12:09:51.384453 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="thanos-sidecar" containerID="cri-o://859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e" gracePeriod=600 Mar 19 12:09:51.384575 master-0 kubenswrapper[17644]: I0319 12:09:51.384382 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="config-reloader" containerID="cri-o://c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce" gracePeriod=600 Mar 19 12:09:51.818350 master-0 kubenswrapper[17644]: I0319 12:09:51.818296 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:09:51.841775 master-0 kubenswrapper[17644]: I0319 12:09:51.841693 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-db\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.841775 master-0 kubenswrapper[17644]: I0319 12:09:51.841765 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-kube-rbac-proxy\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842040 master-0 kubenswrapper[17644]: I0319 12:09:51.841802 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-serving-certs-ca-bundle\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842040 master-0 kubenswrapper[17644]: I0319 12:09:51.841833 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-tls-assets\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842371 master-0 kubenswrapper[17644]: I0319 12:09:51.842314 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). 
InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:51.842445 master-0 kubenswrapper[17644]: I0319 12:09:51.842425 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842492 master-0 kubenswrapper[17644]: I0319 12:09:51.842473 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-grpc-tls\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842541 master-0 kubenswrapper[17644]: I0319 12:09:51.842512 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842541 master-0 kubenswrapper[17644]: I0319 12:09:51.842537 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842625 master-0 kubenswrapper[17644]: I0319 12:09:51.842570 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-kubelet-serving-ca-bundle\") pod 
\"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842625 master-0 kubenswrapper[17644]: I0319 12:09:51.842610 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzwsl\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-kube-api-access-gzwsl\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842703 master-0 kubenswrapper[17644]: I0319 12:09:51.842660 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-metrics-client-certs\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842703 master-0 kubenswrapper[17644]: I0319 12:09:51.842687 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842822 master-0 kubenswrapper[17644]: I0319 12:09:51.842800 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-metrics-client-ca\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842871 master-0 kubenswrapper[17644]: I0319 12:09:51.842846 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-tls\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: 
\"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842917 master-0 kubenswrapper[17644]: I0319 12:09:51.842877 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-rulefiles-0\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842960 master-0 kubenswrapper[17644]: I0319 12:09:51.842915 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-web-config\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.842960 master-0 kubenswrapper[17644]: I0319 12:09:51.842944 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-thanos-prometheus-http-client-file\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.843047 master-0 kubenswrapper[17644]: I0319 12:09:51.843000 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config-out\") pod \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\" (UID: \"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e\") " Mar 19 12:09:51.843315 master-0 kubenswrapper[17644]: I0319 12:09:51.843278 17644 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.848960 master-0 kubenswrapper[17644]: I0319 12:09:51.844886 17644 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.848960 master-0 kubenswrapper[17644]: I0319 12:09:51.845066 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:51.848960 master-0 kubenswrapper[17644]: I0319 12:09:51.845099 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:51.848960 master-0 kubenswrapper[17644]: I0319 12:09:51.845122 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:09:51.848960 master-0 kubenswrapper[17644]: I0319 12:09:51.846642 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.848960 master-0 kubenswrapper[17644]: I0319 12:09:51.847511 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:51.848960 master-0 kubenswrapper[17644]: I0319 12:09:51.847878 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.848960 master-0 kubenswrapper[17644]: I0319 12:09:51.848881 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:09:51.849928 master-0 kubenswrapper[17644]: I0319 12:09:51.849899 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-kube-api-access-gzwsl" (OuterVolumeSpecName: "kube-api-access-gzwsl") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "kube-api-access-gzwsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:09:51.850043 master-0 kubenswrapper[17644]: I0319 12:09:51.849963 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.855189 master-0 kubenswrapper[17644]: I0319 12:09:51.851898 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:09:51.855189 master-0 kubenswrapper[17644]: I0319 12:09:51.855119 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config" (OuterVolumeSpecName: "config") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.858677 master-0 kubenswrapper[17644]: I0319 12:09:51.857720 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.870834 master-0 kubenswrapper[17644]: I0319 12:09:51.860072 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.870834 master-0 kubenswrapper[17644]: I0319 12:09:51.860113 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.870834 master-0 kubenswrapper[17644]: I0319 12:09:51.860089 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config-out" (OuterVolumeSpecName: "config-out") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:09:51.906984 master-0 kubenswrapper[17644]: I0319 12:09:51.903042 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-web-config" (OuterVolumeSpecName: "web-config") pod "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" (UID: "5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951606 17644 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951666 17644 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951681 17644 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951697 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzwsl\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-kube-api-access-gzwsl\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951710 17644 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951740 17644 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951755 17644 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951768 17644 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.951756 master-0 kubenswrapper[17644]: I0319 12:09:51.951785 17644 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.952205 master-0 kubenswrapper[17644]: I0319 12:09:51.951802 17644 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-web-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.952205 master-0 kubenswrapper[17644]: I0319 12:09:51.951815 17644 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.952205 
master-0 kubenswrapper[17644]: I0319 12:09:51.951827 17644 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-config-out\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.952205 master-0 kubenswrapper[17644]: I0319 12:09:51.951844 17644 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.952205 master-0 kubenswrapper[17644]: I0319 12:09:51.951858 17644 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.952205 master-0 kubenswrapper[17644]: I0319 12:09:51.951870 17644 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.952205 master-0 kubenswrapper[17644]: I0319 12:09:51.951882 17644 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.952205 master-0 kubenswrapper[17644]: I0319 12:09:51.951894 17644 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:52.062078 master-0 kubenswrapper[17644]: I0319 12:09:52.061962 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:09:52.063139 master-0 
kubenswrapper[17644]: I0319 12:09:52.062264 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://1700c51a0be3b8389e42a5cf379351f4fa21a1a23cc74be2e934a716c3897cd0" gracePeriod=30 Mar 19 12:09:52.063139 master-0 kubenswrapper[17644]: I0319 12:09:52.062430 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" containerID="cri-o://a1820a5c08e897a9a826fb2120795cc3a6c64a34860ea5d00dda9abbdf9766f3" gracePeriod=30 Mar 19 12:09:52.063139 master-0 kubenswrapper[17644]: I0319 12:09:52.062484 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager" containerID="cri-o://b444dcaee3b4ca7a60c29c3343ca436c90a224a2cac7695b9a98404124c21d5b" gracePeriod=30 Mar 19 12:09:52.063139 master-0 kubenswrapper[17644]: I0319 12:09:52.062514 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://6df3457295116a2e9643f9aa93c1bc33230ddf9f1366aab4d64dcdaedbded1b4" gracePeriod=30 Mar 19 12:09:52.065272 master-0 kubenswrapper[17644]: I0319 12:09:52.065224 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:09:52.065655 master-0 kubenswrapper[17644]: E0319 12:09:52.065596 17644 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" Mar 19 12:09:52.065655 master-0 kubenswrapper[17644]: I0319 12:09:52.065618 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" Mar 19 12:09:52.065655 master-0 kubenswrapper[17644]: E0319 12:09:52.065631 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager" Mar 19 12:09:52.065655 master-0 kubenswrapper[17644]: I0319 12:09:52.065639 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager" Mar 19 12:09:52.065655 master-0 kubenswrapper[17644]: E0319 12:09:52.065647 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy-thanos" Mar 19 12:09:52.065655 master-0 kubenswrapper[17644]: I0319 12:09:52.065655 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy-thanos" Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065681 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065695 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller" Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065708 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager-recovery-controller" Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065716 17644 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager-recovery-controller"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065750 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="prometheus"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065759 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="prometheus"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065771 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065780 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065792 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="init-config-reloader"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065800 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="init-config-reloader"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065809 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065819 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065832 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy-web"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065841 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy-web"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065853 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="thanos-sidecar"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065861 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="thanos-sidecar"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065873 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065880 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065892 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="config-reloader"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065900 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="config-reloader"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: E0319 12:09:52.065916 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager-cert-syncer"
Mar 19 12:09:52.066023 master-0 kubenswrapper[17644]: I0319 12:09:52.065924 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager-cert-syncer"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066074 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066094 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="config-reloader"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066118 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066133 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="prometheus"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066145 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066155 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066171 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager-recovery-controller"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066187 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager-cert-syncer"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066201 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy-web"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066216 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="kube-rbac-proxy-thanos"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066228 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerName="thanos-sidecar"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066494 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="607a35c2a34325129014a178207e606c" containerName="cluster-policy-controller"
Mar 19 12:09:52.066800 master-0 kubenswrapper[17644]: I0319 12:09:52.066521 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="607a35c2a34325129014a178207e606c" containerName="kube-controller-manager"
Mar 19 12:09:52.154356 master-0 kubenswrapper[17644]: I0319 12:09:52.154297 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cd89f86c9be90c18d6ac0ac77e416132-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"cd89f86c9be90c18d6ac0ac77e416132\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:09:52.154493 master-0 kubenswrapper[17644]: I0319 12:09:52.154452 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cd89f86c9be90c18d6ac0ac77e416132-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"cd89f86c9be90c18d6ac0ac77e416132\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:09:52.212554 master-0 kubenswrapper[17644]: I0319 12:09:52.212489 17644 generic.go:334] "Generic (PLEG): container finished" podID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19" exitCode=0
Mar 19 12:09:52.212554 master-0 kubenswrapper[17644]: I0319 12:09:52.212528 17644 generic.go:334] "Generic (PLEG): container finished" podID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0" exitCode=0
Mar 19 12:09:52.212554 master-0 kubenswrapper[17644]: I0319 12:09:52.212536 17644 generic.go:334] "Generic (PLEG): container finished" podID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698" exitCode=0
Mar 19 12:09:52.212554 master-0 kubenswrapper[17644]: I0319 12:09:52.212543 17644 generic.go:334] "Generic (PLEG): container finished" podID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e" exitCode=0
Mar 19 12:09:52.212554 master-0 kubenswrapper[17644]: I0319 12:09:52.212551 17644 generic.go:334] "Generic (PLEG): container finished" podID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce" exitCode=0
Mar 19 12:09:52.212554 master-0 kubenswrapper[17644]: I0319 12:09:52.212557 17644 generic.go:334] "Generic (PLEG): container finished" podID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd" exitCode=0
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212602 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerDied","Data":"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"}
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212630 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerDied","Data":"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"}
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212642 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerDied","Data":"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"}
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212651 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerDied","Data":"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"}
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212663 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerDied","Data":"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"}
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212672 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerDied","Data":"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"}
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212682 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e","Type":"ContainerDied","Data":"5c7857ea1a010de91399569238d9a1c4c44df25ba2ca42c7577db11326546d38"}
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212697 17644 scope.go:117] "RemoveContainer" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"
Mar 19 12:09:52.212951 master-0 kubenswrapper[17644]: I0319 12:09:52.212831 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 12:09:52.215869 master-0 kubenswrapper[17644]: I0319 12:09:52.215828 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="607a35c2a34325129014a178207e606c" podUID="cd89f86c9be90c18d6ac0ac77e416132"
Mar 19 12:09:52.218083 master-0 kubenswrapper[17644]: I0319 12:09:52.218023 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/cluster-policy-controller/1.log"
Mar 19 12:09:52.219343 master-0 kubenswrapper[17644]: I0319 12:09:52.219283 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:09:52.219877 master-0 kubenswrapper[17644]: I0319 12:09:52.219857 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager/0.log"
Mar 19 12:09:52.219960 master-0 kubenswrapper[17644]: I0319 12:09:52.219897 17644 generic.go:334] "Generic (PLEG): container finished" podID="607a35c2a34325129014a178207e606c" containerID="a1820a5c08e897a9a826fb2120795cc3a6c64a34860ea5d00dda9abbdf9766f3" exitCode=0
Mar 19 12:09:52.219960 master-0 kubenswrapper[17644]: I0319 12:09:52.219912 17644 generic.go:334] "Generic (PLEG): container finished" podID="607a35c2a34325129014a178207e606c" containerID="b444dcaee3b4ca7a60c29c3343ca436c90a224a2cac7695b9a98404124c21d5b" exitCode=0
Mar 19 12:09:52.219960 master-0 kubenswrapper[17644]: I0319 12:09:52.219919 17644 generic.go:334] "Generic (PLEG): container finished" podID="607a35c2a34325129014a178207e606c" containerID="6df3457295116a2e9643f9aa93c1bc33230ddf9f1366aab4d64dcdaedbded1b4" exitCode=0
Mar 19 12:09:52.219960 master-0 kubenswrapper[17644]: I0319 12:09:52.219926 17644 generic.go:334] "Generic (PLEG): container finished" podID="607a35c2a34325129014a178207e606c" containerID="1700c51a0be3b8389e42a5cf379351f4fa21a1a23cc74be2e934a716c3897cd0" exitCode=2
Mar 19 12:09:52.220163 master-0 kubenswrapper[17644]: I0319 12:09:52.220005 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e7e400236405f2886d239c6b9855e16c927caf360652a9a3fc0202e1c9146b"
Mar 19 12:09:52.235023 master-0 kubenswrapper[17644]: I0319 12:09:52.234780 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/cluster-policy-controller/1.log"
Mar 19 12:09:52.236812 master-0 kubenswrapper[17644]: I0319 12:09:52.236782 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:09:52.236992 master-0 kubenswrapper[17644]: I0319 12:09:52.236966 17644 scope.go:117] "RemoveContainer" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"
Mar 19 12:09:52.237506 master-0 kubenswrapper[17644]: I0319 12:09:52.237466 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager/0.log"
Mar 19 12:09:52.237606 master-0 kubenswrapper[17644]: I0319 12:09:52.237575 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:09:52.255507 master-0 kubenswrapper[17644]: I0319 12:09:52.255411 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-cert-dir\") pod \"607a35c2a34325129014a178207e606c\" (UID: \"607a35c2a34325129014a178207e606c\") "
Mar 19 12:09:52.256075 master-0 kubenswrapper[17644]: I0319 12:09:52.255528 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/607a35c2a34325129014a178207e606c-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "607a35c2a34325129014a178207e606c" (UID: "607a35c2a34325129014a178207e606c"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:09:52.256075 master-0 kubenswrapper[17644]: I0319 12:09:52.255617 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-resource-dir\") pod \"607a35c2a34325129014a178207e606c\" (UID: \"607a35c2a34325129014a178207e606c\") "
Mar 19 12:09:52.256187 master-0 kubenswrapper[17644]: I0319 12:09:52.256097 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cd89f86c9be90c18d6ac0ac77e416132-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"cd89f86c9be90c18d6ac0ac77e416132\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:09:52.256187 master-0 kubenswrapper[17644]: I0319 12:09:52.256150 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cd89f86c9be90c18d6ac0ac77e416132-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"cd89f86c9be90c18d6ac0ac77e416132\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:09:52.256597 master-0 kubenswrapper[17644]: I0319 12:09:52.256565 17644 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:52.256640 master-0 kubenswrapper[17644]: I0319 12:09:52.256626 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cd89f86c9be90c18d6ac0ac77e416132-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"cd89f86c9be90c18d6ac0ac77e416132\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:09:52.256719 master-0 kubenswrapper[17644]: I0319 12:09:52.256696 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/607a35c2a34325129014a178207e606c-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "607a35c2a34325129014a178207e606c" (UID: "607a35c2a34325129014a178207e606c"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:09:52.256811 master-0 kubenswrapper[17644]: I0319 12:09:52.256760 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cd89f86c9be90c18d6ac0ac77e416132-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"cd89f86c9be90c18d6ac0ac77e416132\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:09:52.258085 master-0 kubenswrapper[17644]: I0319 12:09:52.257957 17644 scope.go:117] "RemoveContainer" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"
Mar 19 12:09:52.259467 master-0 kubenswrapper[17644]: I0319 12:09:52.259399 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 19 12:09:52.259714 master-0 kubenswrapper[17644]: I0319 12:09:52.259691 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="607a35c2a34325129014a178207e606c" podUID="cd89f86c9be90c18d6ac0ac77e416132"
Mar 19 12:09:52.264263 master-0 kubenswrapper[17644]: I0319 12:09:52.264180 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 19 12:09:52.272795 master-0 kubenswrapper[17644]: I0319 12:09:52.272698 17644 scope.go:117] "RemoveContainer" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"
Mar 19 12:09:52.337827 master-0 kubenswrapper[17644]: I0319 12:09:52.337706 17644 scope.go:117] "RemoveContainer" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"
Mar 19 12:09:52.354682 master-0 kubenswrapper[17644]: I0319 12:09:52.354626 17644 scope.go:117] "RemoveContainer" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"
Mar 19 12:09:52.357693 master-0 kubenswrapper[17644]: I0319 12:09:52.357640 17644 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/607a35c2a34325129014a178207e606c-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:52.369499 master-0 kubenswrapper[17644]: I0319 12:09:52.369443 17644 scope.go:117] "RemoveContainer" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"
Mar 19 12:09:52.383082 master-0 kubenswrapper[17644]: I0319 12:09:52.383036 17644 scope.go:117] "RemoveContainer" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"
Mar 19 12:09:52.384194 master-0 kubenswrapper[17644]: E0319 12:09:52.384139 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": container with ID starting with 9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19 not found: ID does not exist" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"
Mar 19 12:09:52.384550 master-0 kubenswrapper[17644]: I0319 12:09:52.384193 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"} err="failed to get container status \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": rpc error: code = NotFound desc = could not find container \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": container with ID starting with 9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19 not found: ID does not exist"
Mar 19 12:09:52.384550 master-0 kubenswrapper[17644]: I0319 12:09:52.384222 17644 scope.go:117] "RemoveContainer" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"
Mar 19 12:09:52.384641 master-0 kubenswrapper[17644]: E0319 12:09:52.384543 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": container with ID starting with f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0 not found: ID does not exist" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"
Mar 19 12:09:52.384641 master-0 kubenswrapper[17644]: I0319 12:09:52.384577 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"} err="failed to get container status \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": rpc error: code = NotFound desc = could not find container \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": container with ID starting with f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0 not found: ID does not exist"
Mar 19 12:09:52.384641 master-0 kubenswrapper[17644]: I0319 12:09:52.384619 17644 scope.go:117] "RemoveContainer" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"
Mar 19 12:09:52.385161 master-0 kubenswrapper[17644]: E0319 12:09:52.385081 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": container with ID starting with d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698 not found: ID does not exist" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"
Mar 19 12:09:52.385161 master-0 kubenswrapper[17644]: I0319 12:09:52.385104 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"} err="failed to get container status \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": rpc error: code = NotFound desc = could not find container \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": container with ID starting with d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698 not found: ID does not exist"
Mar 19 12:09:52.385161 master-0 kubenswrapper[17644]: I0319 12:09:52.385116 17644 scope.go:117] "RemoveContainer" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"
Mar 19 12:09:52.385494 master-0 kubenswrapper[17644]: E0319 12:09:52.385454 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": container with ID starting with 859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e not found: ID does not exist" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"
Mar 19 12:09:52.385537 master-0 kubenswrapper[17644]: I0319 12:09:52.385488 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"} err="failed to get container status \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": rpc error: code = NotFound desc = could not find container \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": container with ID starting with 859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e not found: ID does not exist"
Mar 19 12:09:52.385537 master-0 kubenswrapper[17644]: I0319 12:09:52.385507 17644 scope.go:117] "RemoveContainer" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"
Mar 19 12:09:52.386828 master-0 kubenswrapper[17644]: E0319 12:09:52.386782 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": container with ID starting with c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce not found: ID does not exist" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"
Mar 19 12:09:52.386828 master-0 kubenswrapper[17644]: I0319 12:09:52.386817 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"} err="failed to get container status \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": rpc error: code = NotFound desc = could not find container \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": container with ID starting with c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce not found: ID does not exist"
Mar 19 12:09:52.386955 master-0 kubenswrapper[17644]: I0319 12:09:52.386834 17644 scope.go:117] "RemoveContainer" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"
Mar 19 12:09:52.387252 master-0 kubenswrapper[17644]: E0319 12:09:52.387217 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": container with ID starting with 39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd not found: ID does not exist" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"
Mar 19 12:09:52.387252 master-0 kubenswrapper[17644]: I0319 12:09:52.387241 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"} err="failed to get container status \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": rpc error: code = NotFound desc = could not find container \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": container with ID starting with 39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd not found: ID does not exist"
Mar 19 12:09:52.387252 master-0 kubenswrapper[17644]: I0319 12:09:52.387254 17644 scope.go:117] "RemoveContainer" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"
Mar 19 12:09:52.387624 master-0 kubenswrapper[17644]: E0319 12:09:52.387595 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": container with ID starting with e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71 not found: ID does not exist" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"
Mar 19 12:09:52.387665 master-0 kubenswrapper[17644]: I0319 12:09:52.387620 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"} err="failed to get container status \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": rpc error: code = NotFound desc = could not find container \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": container with ID starting with e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71 not found: ID does not exist"
Mar 19 12:09:52.387665 master-0 kubenswrapper[17644]: I0319 12:09:52.387638 17644 scope.go:117] "RemoveContainer" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"
Mar 19 12:09:52.388012 master-0 kubenswrapper[17644]: I0319 12:09:52.387975 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"} err="failed to get container status \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": rpc error: code = NotFound desc = could not find container \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": container with ID starting with 9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19 not found: ID does not exist"
Mar 19 12:09:52.388012 master-0 kubenswrapper[17644]: I0319 12:09:52.388004 17644 scope.go:117] "RemoveContainer" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"
Mar 19 12:09:52.388680 master-0 kubenswrapper[17644]: I0319 12:09:52.388646 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"} err="failed to get container status \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": rpc error: code = NotFound desc = could not find container \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": container with ID starting with f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0 not found: ID does not exist"
Mar 19 12:09:52.388680 master-0 kubenswrapper[17644]: I0319 12:09:52.388670 17644 scope.go:117] "RemoveContainer" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"
Mar 19 12:09:52.389106 master-0 kubenswrapper[17644]: I0319 12:09:52.389049 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"} err="failed to get container status \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": rpc error: code = NotFound desc = could not find container \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": container with ID starting with d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698 not found: ID does not exist"
Mar 19 12:09:52.389106 master-0 kubenswrapper[17644]: I0319 12:09:52.389083 17644 scope.go:117] "RemoveContainer" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"
Mar 19 12:09:52.389379 master-0 kubenswrapper[17644]: I0319 12:09:52.389347 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"} err="failed to get container status \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": rpc error: code = NotFound desc = could not find container \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": container with ID starting with 859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e not found: ID does not exist"
Mar 19 12:09:52.389379 master-0 kubenswrapper[17644]: I0319 12:09:52.389369 17644 scope.go:117] "RemoveContainer" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"
Mar 19 12:09:52.389657 master-0 kubenswrapper[17644]: I0319 12:09:52.389624 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"} err="failed to get container status \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": rpc error: code = NotFound desc = could not find container \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": container with ID starting with c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce not found: ID does not exist"
Mar 19 12:09:52.389657 master-0 kubenswrapper[17644]: I0319 12:09:52.389650 17644 scope.go:117] "RemoveContainer" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"
Mar 19 12:09:52.390065 master-0 kubenswrapper[17644]: I0319 12:09:52.390037 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"} err="failed to get container status \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": rpc error: code = NotFound desc = could not find container \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": container with ID starting with 39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd not found: ID does not exist"
Mar 19 12:09:52.390065 master-0 kubenswrapper[17644]: I0319 12:09:52.390059 17644 scope.go:117] "RemoveContainer" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"
Mar 19 12:09:52.390483 master-0 kubenswrapper[17644]: I0319 12:09:52.390452 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"} err="failed to get container status \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": rpc error: code = NotFound desc = could not find container \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": container with ID starting with e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71 not found: ID does not exist"
Mar 19 12:09:52.390483 master-0 kubenswrapper[17644]: I0319 12:09:52.390475 17644 scope.go:117] "RemoveContainer" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"
Mar 19 12:09:52.390834 master-0 kubenswrapper[17644]: I0319 12:09:52.390804 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"} err="failed to get container status \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": rpc error: code = NotFound desc = could not find container \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": container with ID starting with 9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19 not found: ID does not exist"
Mar 19 12:09:52.390834 master-0 kubenswrapper[17644]: I0319 12:09:52.390825 17644 scope.go:117] "RemoveContainer" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"
Mar 19 12:09:52.391052 master-0 kubenswrapper[17644]: I0319 12:09:52.391019 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"} err="failed to get container status \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": rpc error: code = NotFound desc = could not find container \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": container with ID starting with f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0 not found: ID does not exist" Mar 19 12:09:52.391052 master-0 kubenswrapper[17644]: I0319 12:09:52.391046 17644 scope.go:117] "RemoveContainer" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698" Mar 19 12:09:52.391413 master-0 kubenswrapper[17644]: I0319 12:09:52.391375 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"} err="failed to get container status \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": rpc error: code = NotFound desc = could not find container \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": container with ID starting with d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698 not found: ID does not exist" Mar 19 12:09:52.391413 master-0 kubenswrapper[17644]: I0319 12:09:52.391395 17644 scope.go:117] "RemoveContainer" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e" Mar 19 12:09:52.391717 master-0 kubenswrapper[17644]: I0319 12:09:52.391696 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"} err="failed to get container status \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": rpc error: code = NotFound desc = could not find 
container \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": container with ID starting with 859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e not found: ID does not exist" Mar 19 12:09:52.391717 master-0 kubenswrapper[17644]: I0319 12:09:52.391714 17644 scope.go:117] "RemoveContainer" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce" Mar 19 12:09:52.392077 master-0 kubenswrapper[17644]: I0319 12:09:52.392048 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"} err="failed to get container status \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": rpc error: code = NotFound desc = could not find container \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": container with ID starting with c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce not found: ID does not exist" Mar 19 12:09:52.392077 master-0 kubenswrapper[17644]: I0319 12:09:52.392067 17644 scope.go:117] "RemoveContainer" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd" Mar 19 12:09:52.392284 master-0 kubenswrapper[17644]: I0319 12:09:52.392248 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"} err="failed to get container status \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": rpc error: code = NotFound desc = could not find container \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": container with ID starting with 39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd not found: ID does not exist" Mar 19 12:09:52.392284 master-0 kubenswrapper[17644]: I0319 12:09:52.392269 17644 scope.go:117] "RemoveContainer" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71" 
Mar 19 12:09:52.392667 master-0 kubenswrapper[17644]: I0319 12:09:52.392598 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"} err="failed to get container status \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": rpc error: code = NotFound desc = could not find container \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": container with ID starting with e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71 not found: ID does not exist" Mar 19 12:09:52.392667 master-0 kubenswrapper[17644]: I0319 12:09:52.392617 17644 scope.go:117] "RemoveContainer" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19" Mar 19 12:09:52.392929 master-0 kubenswrapper[17644]: I0319 12:09:52.392885 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"} err="failed to get container status \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": rpc error: code = NotFound desc = could not find container \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": container with ID starting with 9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19 not found: ID does not exist" Mar 19 12:09:52.392929 master-0 kubenswrapper[17644]: I0319 12:09:52.392913 17644 scope.go:117] "RemoveContainer" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0" Mar 19 12:09:52.393467 master-0 kubenswrapper[17644]: I0319 12:09:52.393433 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"} err="failed to get container status \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": rpc error: code = NotFound desc = could not find 
container \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": container with ID starting with f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0 not found: ID does not exist" Mar 19 12:09:52.393542 master-0 kubenswrapper[17644]: I0319 12:09:52.393468 17644 scope.go:117] "RemoveContainer" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698" Mar 19 12:09:52.393976 master-0 kubenswrapper[17644]: I0319 12:09:52.393837 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"} err="failed to get container status \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": rpc error: code = NotFound desc = could not find container \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": container with ID starting with d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698 not found: ID does not exist" Mar 19 12:09:52.393976 master-0 kubenswrapper[17644]: I0319 12:09:52.393858 17644 scope.go:117] "RemoveContainer" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e" Mar 19 12:09:52.394259 master-0 kubenswrapper[17644]: I0319 12:09:52.394228 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"} err="failed to get container status \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": rpc error: code = NotFound desc = could not find container \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": container with ID starting with 859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e not found: ID does not exist" Mar 19 12:09:52.394259 master-0 kubenswrapper[17644]: I0319 12:09:52.394256 17644 scope.go:117] "RemoveContainer" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce" 
Mar 19 12:09:52.394582 master-0 kubenswrapper[17644]: I0319 12:09:52.394549 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"} err="failed to get container status \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": rpc error: code = NotFound desc = could not find container \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": container with ID starting with c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce not found: ID does not exist" Mar 19 12:09:52.394582 master-0 kubenswrapper[17644]: I0319 12:09:52.394569 17644 scope.go:117] "RemoveContainer" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd" Mar 19 12:09:52.394851 master-0 kubenswrapper[17644]: I0319 12:09:52.394820 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"} err="failed to get container status \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": rpc error: code = NotFound desc = could not find container \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": container with ID starting with 39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd not found: ID does not exist" Mar 19 12:09:52.394851 master-0 kubenswrapper[17644]: I0319 12:09:52.394841 17644 scope.go:117] "RemoveContainer" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71" Mar 19 12:09:52.395207 master-0 kubenswrapper[17644]: I0319 12:09:52.395178 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"} err="failed to get container status \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": rpc error: code = NotFound desc = could not find 
container \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": container with ID starting with e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71 not found: ID does not exist" Mar 19 12:09:52.395207 master-0 kubenswrapper[17644]: I0319 12:09:52.395199 17644 scope.go:117] "RemoveContainer" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19" Mar 19 12:09:52.395404 master-0 kubenswrapper[17644]: I0319 12:09:52.395376 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"} err="failed to get container status \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": rpc error: code = NotFound desc = could not find container \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": container with ID starting with 9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19 not found: ID does not exist" Mar 19 12:09:52.395404 master-0 kubenswrapper[17644]: I0319 12:09:52.395394 17644 scope.go:117] "RemoveContainer" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0" Mar 19 12:09:52.395828 master-0 kubenswrapper[17644]: I0319 12:09:52.395797 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"} err="failed to get container status \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": rpc error: code = NotFound desc = could not find container \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": container with ID starting with f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0 not found: ID does not exist" Mar 19 12:09:52.395828 master-0 kubenswrapper[17644]: I0319 12:09:52.395819 17644 scope.go:117] "RemoveContainer" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698" 
Mar 19 12:09:52.396166 master-0 kubenswrapper[17644]: I0319 12:09:52.396054 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"} err="failed to get container status \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": rpc error: code = NotFound desc = could not find container \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": container with ID starting with d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698 not found: ID does not exist" Mar 19 12:09:52.396166 master-0 kubenswrapper[17644]: I0319 12:09:52.396069 17644 scope.go:117] "RemoveContainer" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e" Mar 19 12:09:52.396405 master-0 kubenswrapper[17644]: I0319 12:09:52.396372 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"} err="failed to get container status \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": rpc error: code = NotFound desc = could not find container \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": container with ID starting with 859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e not found: ID does not exist" Mar 19 12:09:52.396405 master-0 kubenswrapper[17644]: I0319 12:09:52.396394 17644 scope.go:117] "RemoveContainer" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce" Mar 19 12:09:52.396860 master-0 kubenswrapper[17644]: I0319 12:09:52.396827 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"} err="failed to get container status \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": rpc error: code = NotFound desc = could not find 
container \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": container with ID starting with c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce not found: ID does not exist" Mar 19 12:09:52.396860 master-0 kubenswrapper[17644]: I0319 12:09:52.396848 17644 scope.go:117] "RemoveContainer" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd" Mar 19 12:09:52.397240 master-0 kubenswrapper[17644]: I0319 12:09:52.397200 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"} err="failed to get container status \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": rpc error: code = NotFound desc = could not find container \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": container with ID starting with 39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd not found: ID does not exist" Mar 19 12:09:52.397240 master-0 kubenswrapper[17644]: I0319 12:09:52.397226 17644 scope.go:117] "RemoveContainer" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71" Mar 19 12:09:52.397541 master-0 kubenswrapper[17644]: I0319 12:09:52.397510 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"} err="failed to get container status \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": rpc error: code = NotFound desc = could not find container \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": container with ID starting with e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71 not found: ID does not exist" Mar 19 12:09:52.397541 master-0 kubenswrapper[17644]: I0319 12:09:52.397530 17644 scope.go:117] "RemoveContainer" containerID="9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19" 
Mar 19 12:09:52.397804 master-0 kubenswrapper[17644]: I0319 12:09:52.397776 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19"} err="failed to get container status \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": rpc error: code = NotFound desc = could not find container \"9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19\": container with ID starting with 9459211f53316aa73919d75445d0e4f59791e560a016846856fa94c6803e5f19 not found: ID does not exist" Mar 19 12:09:52.397804 master-0 kubenswrapper[17644]: I0319 12:09:52.397797 17644 scope.go:117] "RemoveContainer" containerID="f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0" Mar 19 12:09:52.398152 master-0 kubenswrapper[17644]: I0319 12:09:52.398121 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0"} err="failed to get container status \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": rpc error: code = NotFound desc = could not find container \"f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0\": container with ID starting with f323975864469698f3b4d66a0393d3d84882e2a8779b3cc53f6417b6cc5f27a0 not found: ID does not exist" Mar 19 12:09:52.398152 master-0 kubenswrapper[17644]: I0319 12:09:52.398143 17644 scope.go:117] "RemoveContainer" containerID="d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698" Mar 19 12:09:52.398378 master-0 kubenswrapper[17644]: I0319 12:09:52.398343 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698"} err="failed to get container status \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": rpc error: code = NotFound desc = could not find 
container \"d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698\": container with ID starting with d81a662123874954002627a7b2faeac3547cb815a7378f119911078f80dbd698 not found: ID does not exist" Mar 19 12:09:52.398378 master-0 kubenswrapper[17644]: I0319 12:09:52.398363 17644 scope.go:117] "RemoveContainer" containerID="859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e" Mar 19 12:09:52.398704 master-0 kubenswrapper[17644]: I0319 12:09:52.398673 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e"} err="failed to get container status \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": rpc error: code = NotFound desc = could not find container \"859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e\": container with ID starting with 859b59d6222af800df2404bb6c937521a410f26743baaa2d4b6336c98828999e not found: ID does not exist" Mar 19 12:09:52.398704 master-0 kubenswrapper[17644]: I0319 12:09:52.398693 17644 scope.go:117] "RemoveContainer" containerID="c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce" Mar 19 12:09:52.398977 master-0 kubenswrapper[17644]: I0319 12:09:52.398932 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce"} err="failed to get container status \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": rpc error: code = NotFound desc = could not find container \"c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce\": container with ID starting with c3789ba3c24e692bdc7e9e2d757b14004116e11dfafa08acc23e8d02b851a7ce not found: ID does not exist" Mar 19 12:09:52.398977 master-0 kubenswrapper[17644]: I0319 12:09:52.398966 17644 scope.go:117] "RemoveContainer" containerID="39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd" 
Mar 19 12:09:52.399312 master-0 kubenswrapper[17644]: I0319 12:09:52.399277 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd"} err="failed to get container status \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": rpc error: code = NotFound desc = could not find container \"39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd\": container with ID starting with 39692a6d7ddd88db28953990dfe16cfe2dfeac3f889d9aff29a69453d95d0bbd not found: ID does not exist" Mar 19 12:09:52.399312 master-0 kubenswrapper[17644]: I0319 12:09:52.399298 17644 scope.go:117] "RemoveContainer" containerID="e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71" Mar 19 12:09:52.399572 master-0 kubenswrapper[17644]: I0319 12:09:52.399539 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71"} err="failed to get container status \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": rpc error: code = NotFound desc = could not find container \"e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71\": container with ID starting with e08378f13d01b0b46903f0a67dd877359947750028c69fea7483b1d246852e71 not found: ID does not exist" Mar 19 12:09:52.399572 master-0 kubenswrapper[17644]: I0319 12:09:52.399561 17644 scope.go:117] "RemoveContainer" containerID="9d993cd1a5833f4165f6cc8d12eb8bf154b773e372dc7ee598d180665ac9fcd8" Mar 19 12:09:52.412368 master-0 kubenswrapper[17644]: I0319 12:09:52.412320 17644 scope.go:117] "RemoveContainer" containerID="ea2a9b69ce6100be8b7a5ab8a5b3754a3d903dde1f34f7b2ab132e5a43190e43" Mar 19 12:09:52.511464 master-0 kubenswrapper[17644]: I0319 12:09:52.511396 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e" 
path="/var/lib/kubelet/pods/5aee1f5a-01e6-47b5-91ac-4c77f2b0f54e/volumes" Mar 19 12:09:52.512407 master-0 kubenswrapper[17644]: I0319 12:09:52.512369 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="607a35c2a34325129014a178207e606c" path="/var/lib/kubelet/pods/607a35c2a34325129014a178207e606c/volumes" Mar 19 12:09:52.513593 master-0 kubenswrapper[17644]: I0319 12:09:52.513558 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd3d3608fe9c86b0f65904ec2353df4" path="/var/lib/kubelet/pods/8dd3d3608fe9c86b0f65904ec2353df4/volumes" Mar 19 12:09:52.615028 master-0 kubenswrapper[17644]: I0319 12:09:52.614916 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:09:52.678958 master-0 kubenswrapper[17644]: I0319 12:09:52.678875 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access\") pod \"e40539b3-c74d-45b8-8526-d25a3a41c336\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " Mar 19 12:09:52.678958 master-0 kubenswrapper[17644]: I0319 12:09:52.678961 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-kubelet-dir\") pod \"e40539b3-c74d-45b8-8526-d25a3a41c336\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " Mar 19 12:09:52.679211 master-0 kubenswrapper[17644]: I0319 12:09:52.679070 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-var-lock\") pod \"e40539b3-c74d-45b8-8526-d25a3a41c336\" (UID: \"e40539b3-c74d-45b8-8526-d25a3a41c336\") " Mar 19 12:09:52.679485 master-0 kubenswrapper[17644]: I0319 12:09:52.679450 17644 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-var-lock" (OuterVolumeSpecName: "var-lock") pod "e40539b3-c74d-45b8-8526-d25a3a41c336" (UID: "e40539b3-c74d-45b8-8526-d25a3a41c336"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:52.679524 master-0 kubenswrapper[17644]: I0319 12:09:52.679496 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e40539b3-c74d-45b8-8526-d25a3a41c336" (UID: "e40539b3-c74d-45b8-8526-d25a3a41c336"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:52.682112 master-0 kubenswrapper[17644]: I0319 12:09:52.682056 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e40539b3-c74d-45b8-8526-d25a3a41c336" (UID: "e40539b3-c74d-45b8-8526-d25a3a41c336"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:09:52.780844 master-0 kubenswrapper[17644]: I0319 12:09:52.780785 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e40539b3-c74d-45b8-8526-d25a3a41c336-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:52.780844 master-0 kubenswrapper[17644]: I0319 12:09:52.780826 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:52.780844 master-0 kubenswrapper[17644]: I0319 12:09:52.780836 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e40539b3-c74d-45b8-8526-d25a3a41c336-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:53.226656 master-0 kubenswrapper[17644]: I0319 12:09:53.226586 17644 generic.go:334] "Generic (PLEG): container finished" podID="47d6a091-6854-4e44-8e7c-b2089cae286c" containerID="10280c512d50c98469a1460825a080495daf0b68a956dd42c1acbb90ff4776d5" exitCode=0 Mar 19 12:09:53.226938 master-0 kubenswrapper[17644]: I0319 12:09:53.226683 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"47d6a091-6854-4e44-8e7c-b2089cae286c","Type":"ContainerDied","Data":"10280c512d50c98469a1460825a080495daf0b68a956dd42c1acbb90ff4776d5"} Mar 19 12:09:53.228880 master-0 kubenswrapper[17644]: I0319 12:09:53.228846 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"e40539b3-c74d-45b8-8526-d25a3a41c336","Type":"ContainerDied","Data":"8e673fdc2a73d34bceac27f1f86506af398f6d45a54b2a2728987ee2b3c14168"} Mar 19 12:09:53.228880 master-0 kubenswrapper[17644]: I0319 12:09:53.228868 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 12:09:53.228967 master-0 kubenswrapper[17644]: I0319 12:09:53.228880 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e673fdc2a73d34bceac27f1f86506af398f6d45a54b2a2728987ee2b3c14168" Mar 19 12:09:53.231934 master-0 kubenswrapper[17644]: I0319 12:09:53.231892 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_607a35c2a34325129014a178207e606c/kube-controller-manager-cert-syncer/0.log" Mar 19 12:09:53.232098 master-0 kubenswrapper[17644]: I0319 12:09:53.232017 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:09:53.244277 master-0 kubenswrapper[17644]: I0319 12:09:53.244227 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="607a35c2a34325129014a178207e606c" podUID="cd89f86c9be90c18d6ac0ac77e416132" Mar 19 12:09:54.537459 master-0 kubenswrapper[17644]: I0319 12:09:54.537401 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 12:09:54.606058 master-0 kubenswrapper[17644]: I0319 12:09:54.606004 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d6a091-6854-4e44-8e7c-b2089cae286c-kube-api-access\") pod \"47d6a091-6854-4e44-8e7c-b2089cae286c\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " Mar 19 12:09:54.606278 master-0 kubenswrapper[17644]: I0319 12:09:54.606106 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-kubelet-dir\") pod \"47d6a091-6854-4e44-8e7c-b2089cae286c\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " Mar 19 12:09:54.606278 master-0 kubenswrapper[17644]: I0319 12:09:54.606168 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-var-lock\") pod \"47d6a091-6854-4e44-8e7c-b2089cae286c\" (UID: \"47d6a091-6854-4e44-8e7c-b2089cae286c\") " Mar 19 12:09:54.606278 master-0 kubenswrapper[17644]: I0319 12:09:54.606214 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "47d6a091-6854-4e44-8e7c-b2089cae286c" (UID: "47d6a091-6854-4e44-8e7c-b2089cae286c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:54.606374 master-0 kubenswrapper[17644]: I0319 12:09:54.606307 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-var-lock" (OuterVolumeSpecName: "var-lock") pod "47d6a091-6854-4e44-8e7c-b2089cae286c" (UID: "47d6a091-6854-4e44-8e7c-b2089cae286c"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:09:54.606488 master-0 kubenswrapper[17644]: I0319 12:09:54.606459 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:54.606488 master-0 kubenswrapper[17644]: I0319 12:09:54.606482 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47d6a091-6854-4e44-8e7c-b2089cae286c-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:54.642314 master-0 kubenswrapper[17644]: I0319 12:09:54.642256 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d6a091-6854-4e44-8e7c-b2089cae286c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "47d6a091-6854-4e44-8e7c-b2089cae286c" (UID: "47d6a091-6854-4e44-8e7c-b2089cae286c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:09:54.708361 master-0 kubenswrapper[17644]: I0319 12:09:54.708269 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d6a091-6854-4e44-8e7c-b2089cae286c-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:55.250679 master-0 kubenswrapper[17644]: I0319 12:09:55.250627 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"47d6a091-6854-4e44-8e7c-b2089cae286c","Type":"ContainerDied","Data":"c69e0702250c60cdbd0576be35b123739c887d356444f6df4e557577b9a12928"}
Mar 19 12:09:55.250679 master-0 kubenswrapper[17644]: I0319 12:09:55.250673 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c69e0702250c60cdbd0576be35b123739c887d356444f6df4e557577b9a12928"
Mar 19 12:09:55.250986 master-0 kubenswrapper[17644]: I0319 12:09:55.250689 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 12:09:55.484196 master-0 kubenswrapper[17644]: I0319 12:09:55.484135 17644 scope.go:117] "RemoveContainer" containerID="929340400aba9fe4e6bdc1b44a88bcc955ba57855b3d7ae839aa6c55c2194e53"
Mar 19 12:09:56.274791 master-0 kubenswrapper[17644]: I0319 12:09:56.274707 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/4.log"
Mar 19 12:09:56.275693 master-0 kubenswrapper[17644]: I0319 12:09:56.274857 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-764k4" event={"ID":"d625c81e-01cc-424a-997d-546a5204a72b","Type":"ContainerStarted","Data":"496fd130e85065aa597bde61afdf890114c3f6997fcfc8cd06bb41d26d45b04f"}
Mar 19 12:09:56.874710 master-0 kubenswrapper[17644]: I0319 12:09:56.874576 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:09:56.874938 master-0 kubenswrapper[17644]: E0319 12:09:56.874865 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d6a091-6854-4e44-8e7c-b2089cae286c" containerName="installer"
Mar 19 12:09:56.874938 master-0 kubenswrapper[17644]: I0319 12:09:56.874877 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d6a091-6854-4e44-8e7c-b2089cae286c" containerName="installer"
Mar 19 12:09:56.874938 master-0 kubenswrapper[17644]: E0319 12:09:56.874898 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40539b3-c74d-45b8-8526-d25a3a41c336" containerName="installer"
Mar 19 12:09:56.874938 master-0 kubenswrapper[17644]: I0319 12:09:56.874904 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40539b3-c74d-45b8-8526-d25a3a41c336" containerName="installer"
Mar 19 12:09:56.875097 master-0 kubenswrapper[17644]: I0319 12:09:56.875015 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="e40539b3-c74d-45b8-8526-d25a3a41c336" containerName="installer"
Mar 19 12:09:56.875097 master-0 kubenswrapper[17644]: I0319 12:09:56.875040 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d6a091-6854-4e44-8e7c-b2089cae286c" containerName="installer"
Mar 19 12:09:56.875487 master-0 kubenswrapper[17644]: I0319 12:09:56.875461 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:09:56.875615 master-0 kubenswrapper[17644]: I0319 12:09:56.875581 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:56.875678 master-0 kubenswrapper[17644]: I0319 12:09:56.875661 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver" containerID="cri-o://35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660" gracePeriod=15
Mar 19 12:09:56.875803 master-0 kubenswrapper[17644]: I0319 12:09:56.875739 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9" gracePeriod=15
Mar 19 12:09:56.875803 master-0 kubenswrapper[17644]: I0319 12:09:56.875785 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8" gracePeriod=15
Mar 19 12:09:56.875939 master-0 kubenswrapper[17644]: I0319 12:09:56.875818 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9" gracePeriod=15
Mar 19 12:09:56.875939 master-0 kubenswrapper[17644]: I0319 12:09:56.875775 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931" gracePeriod=15
Mar 19 12:09:56.943403 master-0 kubenswrapper[17644]: I0319 12:09:56.943354 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:56.943494 master-0 kubenswrapper[17644]: I0319 12:09:56.943423 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:56.943602 master-0 kubenswrapper[17644]: I0319 12:09:56.943567 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:56.943641 master-0 kubenswrapper[17644]: I0319 12:09:56.943623 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:56.943675 master-0 kubenswrapper[17644]: I0319 12:09:56.943648 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:56.945219 master-0 kubenswrapper[17644]: I0319 12:09:56.945186 17644 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:09:56.945457 master-0 kubenswrapper[17644]: E0319 12:09:56.945432 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-cert-syncer"
Mar 19 12:09:56.945457 master-0 kubenswrapper[17644]: I0319 12:09:56.945451 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-cert-syncer"
Mar 19 12:09:56.945525 master-0 kubenswrapper[17644]: E0319 12:09:56.945480 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="setup"
Mar 19 12:09:56.945525 master-0 kubenswrapper[17644]: I0319 12:09:56.945486 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="setup"
Mar 19 12:09:56.945525 master-0 kubenswrapper[17644]: E0319 12:09:56.945495 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-check-endpoints"
Mar 19 12:09:56.945525 master-0 kubenswrapper[17644]: I0319 12:09:56.945501 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-check-endpoints"
Mar 19 12:09:56.945525 master-0 kubenswrapper[17644]: E0319 12:09:56.945512 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:09:56.945525 master-0 kubenswrapper[17644]: I0319 12:09:56.945518 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:09:56.945685 master-0 kubenswrapper[17644]: E0319 12:09:56.945538 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver"
Mar 19 12:09:56.945685 master-0 kubenswrapper[17644]: I0319 12:09:56.945545 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver"
Mar 19 12:09:56.945685 master-0 kubenswrapper[17644]: E0319 12:09:56.945555 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:09:56.945685 master-0 kubenswrapper[17644]: I0319 12:09:56.945561 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:09:56.945818 master-0 kubenswrapper[17644]: I0319 12:09:56.945696 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:09:56.945818 master-0 kubenswrapper[17644]: I0319 12:09:56.945710 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-cert-syncer"
Mar 19 12:09:56.945818 master-0 kubenswrapper[17644]: I0319 12:09:56.945717 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver"
Mar 19 12:09:56.945818 master-0 kubenswrapper[17644]: I0319 12:09:56.945743 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:09:56.945818 master-0 kubenswrapper[17644]: I0319 12:09:56.945757 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f502b117c7c8479f7f20848a50fec0" containerName="kube-apiserver-check-endpoints"
Mar 19 12:09:57.045096 master-0 kubenswrapper[17644]: I0319 12:09:57.045026 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045096 master-0 kubenswrapper[17644]: I0319 12:09:57.045082 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.045322 master-0 kubenswrapper[17644]: I0319 12:09:57.045153 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045322 master-0 kubenswrapper[17644]: I0319 12:09:57.045221 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045322 master-0 kubenswrapper[17644]: I0319 12:09:57.045256 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045416 master-0 kubenswrapper[17644]: I0319 12:09:57.045326 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045416 master-0 kubenswrapper[17644]: I0319 12:09:57.045404 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.045522 master-0 kubenswrapper[17644]: I0319 12:09:57.045496 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.045556 master-0 kubenswrapper[17644]: I0319 12:09:57.045539 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045654 master-0 kubenswrapper[17644]: I0319 12:09:57.045630 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045686 master-0 kubenswrapper[17644]: I0319 12:09:57.045666 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045718 master-0 kubenswrapper[17644]: I0319 12:09:57.045692 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.045718 master-0 kubenswrapper[17644]: I0319 12:09:57.045713 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:57.147410 master-0 kubenswrapper[17644]: I0319 12:09:57.147255 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.147410 master-0 kubenswrapper[17644]: I0319 12:09:57.147338 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.147410 master-0 kubenswrapper[17644]: I0319 12:09:57.147384 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.147786 master-0 kubenswrapper[17644]: I0319 12:09:57.147423 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.147786 master-0 kubenswrapper[17644]: I0319 12:09:57.147448 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.147786 master-0 kubenswrapper[17644]: I0319 12:09:57.147384 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:57.285499 master-0 kubenswrapper[17644]: I0319 12:09:57.285441 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_d5f502b117c7c8479f7f20848a50fec0/kube-apiserver-cert-syncer/0.log"
Mar 19 12:09:57.286798 master-0 kubenswrapper[17644]: I0319 12:09:57.286686 17644 generic.go:334] "Generic (PLEG): container finished" podID="d5f502b117c7c8479f7f20848a50fec0" containerID="3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9" exitCode=0
Mar 19 12:09:57.286899 master-0 kubenswrapper[17644]: I0319 12:09:57.286794 17644 generic.go:334] "Generic (PLEG): container finished" podID="d5f502b117c7c8479f7f20848a50fec0" containerID="ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931" exitCode=0
Mar 19 12:09:57.286899 master-0 kubenswrapper[17644]: I0319 12:09:57.286818 17644 generic.go:334] "Generic (PLEG): container finished" podID="d5f502b117c7c8479f7f20848a50fec0" containerID="8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8" exitCode=0
Mar 19 12:09:57.286899 master-0 kubenswrapper[17644]: I0319 12:09:57.286847 17644 generic.go:334] "Generic (PLEG): container finished" podID="d5f502b117c7c8479f7f20848a50fec0" containerID="6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9" exitCode=2
Mar 19 12:10:01.483701 master-0 kubenswrapper[17644]: I0319 12:10:01.483629 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:01.502168 master-0 kubenswrapper[17644]: I0319 12:10:01.502109 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26"
Mar 19 12:10:01.502168 master-0 kubenswrapper[17644]: I0319 12:10:01.502150 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26"
Mar 19 12:10:01.503017 master-0 kubenswrapper[17644]: E0319 12:10:01.502962 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:01.503496 master-0 kubenswrapper[17644]: I0319 12:10:01.503463 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:01.529894 master-0 kubenswrapper[17644]: W0319 12:10:01.529811 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8413125cf444e5c95f023c5dd9c6151e.slice/crio-2a60db6fdec87aec4d267a61ffe85180254db28581749512cdaedfe272dce993 WatchSource:0}: Error finding container 2a60db6fdec87aec4d267a61ffe85180254db28581749512cdaedfe272dce993: Status 404 returned error can't find the container with id 2a60db6fdec87aec4d267a61ffe85180254db28581749512cdaedfe272dce993
Mar 19 12:10:01.532798 master-0 kubenswrapper[17644]: E0319 12:10:01.532544 17644 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{openshift-kube-scheduler-master-0.189e3cd7f976f07c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-master-0,UID:8413125cf444e5c95f023c5dd9c6151e,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:10:01.531510908 +0000 UTC m=+635.301468943,LastTimestamp:2026-03-19 12:10:01.531510908 +0000 UTC m=+635.301468943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 12:10:01.925601 master-0 kubenswrapper[17644]: E0319 12:10:01.925528 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:01.926251 master-0 kubenswrapper[17644]: I0319 12:10:01.926213 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:02.320646 master-0 kubenswrapper[17644]: I0319 12:10:02.320554 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"7a4744531cb137d7252790be662d8cc8","Type":"ContainerStarted","Data":"d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745"}
Mar 19 12:10:02.320646 master-0 kubenswrapper[17644]: I0319 12:10:02.320609 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"7a4744531cb137d7252790be662d8cc8","Type":"ContainerStarted","Data":"796edf3b9fb91f1615563a7ce0b0b6311ad58b790fb35c994bf7c8b2141f0e4d"}
Mar 19 12:10:02.321651 master-0 kubenswrapper[17644]: E0319 12:10:02.321559 17644 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:02.321943 master-0 kubenswrapper[17644]: I0319 12:10:02.321910 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"8c318dcaa73dc8cbe4b4aad8b140d9a7a1894465aa62d9fd6977068c310f6aa4"}
Mar 19 12:10:02.321943 master-0 kubenswrapper[17644]: I0319 12:10:02.321941 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"2a60db6fdec87aec4d267a61ffe85180254db28581749512cdaedfe272dce993"}
Mar 19 12:10:02.322161 master-0 kubenswrapper[17644]: I0319 12:10:02.322132 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26"
Mar 19 12:10:02.322161 master-0 kubenswrapper[17644]: I0319 12:10:02.322152 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26"
Mar 19 12:10:02.322609 master-0 kubenswrapper[17644]: E0319 12:10:02.322568 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:02.323598 master-0 kubenswrapper[17644]: I0319 12:10:02.323559 17644 generic.go:334] "Generic (PLEG): container finished" podID="7def3099-f487-44d4-a1d5-2ae096ef8804" containerID="494afb441050ae61d98bfcdbb49df2010d70e4c70da80aefeef6526c1b9b02d2" exitCode=0
Mar 19 12:10:02.323598 master-0 kubenswrapper[17644]: I0319 12:10:02.323594 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"7def3099-f487-44d4-a1d5-2ae096ef8804","Type":"ContainerDied","Data":"494afb441050ae61d98bfcdbb49df2010d70e4c70da80aefeef6526c1b9b02d2"}
Mar 19 12:10:02.324166 master-0 kubenswrapper[17644]: I0319 12:10:02.324132 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:02.737637 master-0 kubenswrapper[17644]: I0319 12:10:02.737572 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_d5f502b117c7c8479f7f20848a50fec0/kube-apiserver-cert-syncer/0.log"
Mar 19 12:10:02.738368 master-0 kubenswrapper[17644]: I0319 12:10:02.738329 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:02.739420 master-0 kubenswrapper[17644]: I0319 12:10:02.739355 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:02.740067 master-0 kubenswrapper[17644]: I0319 12:10:02.740020 17644 status_manager.go:851] "Failed to get status for pod" podUID="d5f502b117c7c8479f7f20848a50fec0" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:02.837861 master-0 kubenswrapper[17644]: I0319 12:10:02.837821 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir\") pod \"d5f502b117c7c8479f7f20848a50fec0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") "
Mar 19 12:10:02.838081 master-0 kubenswrapper[17644]: I0319 12:10:02.837881 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir\") pod \"d5f502b117c7c8479f7f20848a50fec0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") "
Mar 19 12:10:02.838081 master-0 kubenswrapper[17644]: I0319 12:10:02.837940 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir\") pod \"d5f502b117c7c8479f7f20848a50fec0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") "
Mar 19 12:10:02.838081 master-0 kubenswrapper[17644]: I0319 12:10:02.837967 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "d5f502b117c7c8479f7f20848a50fec0" (UID: "d5f502b117c7c8479f7f20848a50fec0"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:10:02.838081 master-0 kubenswrapper[17644]: I0319 12:10:02.837998 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d5f502b117c7c8479f7f20848a50fec0" (UID: "d5f502b117c7c8479f7f20848a50fec0"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:10:02.838081 master-0 kubenswrapper[17644]: I0319 12:10:02.838069 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "d5f502b117c7c8479f7f20848a50fec0" (UID: "d5f502b117c7c8479f7f20848a50fec0"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:10:02.838251 master-0 kubenswrapper[17644]: I0319 12:10:02.838174 17644 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:10:02.838251 master-0 kubenswrapper[17644]: I0319 12:10:02.838186 17644 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:10:02.838251 master-0 kubenswrapper[17644]: I0319 12:10:02.838193 17644 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:10:03.009038 master-0 kubenswrapper[17644]: E0319 12:10:03.008962 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:03.010024 master-0 kubenswrapper[17644]: E0319 12:10:03.009900 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:03.011105 master-0 kubenswrapper[17644]: E0319 12:10:03.010993 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:03.011739 master-0 kubenswrapper[17644]: E0319 12:10:03.011686 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:03.012304 master-0 kubenswrapper[17644]: E0319 12:10:03.012214 17644 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:03.012304 master-0 kubenswrapper[17644]: I0319 12:10:03.012255 17644 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 19 12:10:03.012805 master-0 kubenswrapper[17644]: E0319 12:10:03.012750 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 19 12:10:03.214865 master-0 kubenswrapper[17644]: E0319 12:10:03.214788 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 19 12:10:03.333683 master-0 kubenswrapper[17644]: I0319 12:10:03.333568 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_d5f502b117c7c8479f7f20848a50fec0/kube-apiserver-cert-syncer/0.log"
Mar 19 12:10:03.334671 master-0 kubenswrapper[17644]: I0319 12:10:03.334633 17644 generic.go:334] "Generic (PLEG): container finished" podID="d5f502b117c7c8479f7f20848a50fec0" containerID="35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660" exitCode=0
Mar 19 12:10:03.334800 master-0 kubenswrapper[17644]: I0319 12:10:03.334751 17644 scope.go:117] "RemoveContainer" containerID="3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9"
Mar 19 12:10:03.334869 master-0 kubenswrapper[17644]: I0319 12:10:03.334799 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:03.354495 master-0 kubenswrapper[17644]: I0319 12:10:03.354441 17644 status_manager.go:851] "Failed to get status for pod" podUID="d5f502b117c7c8479f7f20848a50fec0" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:03.355108 master-0 kubenswrapper[17644]: I0319 12:10:03.354957 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:10:03.357121 master-0 kubenswrapper[17644]: I0319 12:10:03.357092 17644 scope.go:117] "RemoveContainer" containerID="ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931"
Mar 19 12:10:03.374296 master-0 kubenswrapper[17644]: I0319 12:10:03.374152 17644 scope.go:117] "RemoveContainer" containerID="8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8"
Mar 19 12:10:03.392454 master-0 kubenswrapper[17644]: I0319 12:10:03.392403 17644 scope.go:117] "RemoveContainer" containerID="6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9"
Mar 19 12:10:03.408959 master-0 kubenswrapper[17644]: I0319 12:10:03.408825 17644 scope.go:117] "RemoveContainer" containerID="35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660"
Mar 19 12:10:03.448826
master-0 kubenswrapper[17644]: I0319 12:10:03.448778 17644 scope.go:117] "RemoveContainer" containerID="07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52" Mar 19 12:10:03.467031 master-0 kubenswrapper[17644]: I0319 12:10:03.466999 17644 scope.go:117] "RemoveContainer" containerID="3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9" Mar 19 12:10:03.467650 master-0 kubenswrapper[17644]: E0319 12:10:03.467602 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9\": container with ID starting with 3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9 not found: ID does not exist" containerID="3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9" Mar 19 12:10:03.467772 master-0 kubenswrapper[17644]: I0319 12:10:03.467659 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9"} err="failed to get container status \"3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9\": rpc error: code = NotFound desc = could not find container \"3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9\": container with ID starting with 3f44dc03d98a097335438c4135778a6b234239f0ce338e05a88d3e7f3ae87ee9 not found: ID does not exist" Mar 19 12:10:03.467772 master-0 kubenswrapper[17644]: I0319 12:10:03.467694 17644 scope.go:117] "RemoveContainer" containerID="ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931" Mar 19 12:10:03.468170 master-0 kubenswrapper[17644]: E0319 12:10:03.468124 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931\": container with ID starting with 
ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931 not found: ID does not exist" containerID="ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931" Mar 19 12:10:03.468228 master-0 kubenswrapper[17644]: I0319 12:10:03.468173 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931"} err="failed to get container status \"ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931\": rpc error: code = NotFound desc = could not find container \"ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931\": container with ID starting with ec7970ff26d79ac8592563e7fdd11f0fbccfd7a30338c99bafb57d01a40d0931 not found: ID does not exist" Mar 19 12:10:03.468228 master-0 kubenswrapper[17644]: I0319 12:10:03.468198 17644 scope.go:117] "RemoveContainer" containerID="8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8" Mar 19 12:10:03.468502 master-0 kubenswrapper[17644]: E0319 12:10:03.468473 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8\": container with ID starting with 8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8 not found: ID does not exist" containerID="8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8" Mar 19 12:10:03.468502 master-0 kubenswrapper[17644]: I0319 12:10:03.468496 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8"} err="failed to get container status \"8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8\": rpc error: code = NotFound desc = could not find container \"8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8\": container with ID starting with 
8c6ec577468afadacead1caed5bc3d3288574f627159f0d3c6f0d8a6df5a21b8 not found: ID does not exist" Mar 19 12:10:03.468576 master-0 kubenswrapper[17644]: I0319 12:10:03.468511 17644 scope.go:117] "RemoveContainer" containerID="6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9" Mar 19 12:10:03.469100 master-0 kubenswrapper[17644]: E0319 12:10:03.468756 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9\": container with ID starting with 6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9 not found: ID does not exist" containerID="6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9" Mar 19 12:10:03.469100 master-0 kubenswrapper[17644]: I0319 12:10:03.468790 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9"} err="failed to get container status \"6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9\": rpc error: code = NotFound desc = could not find container \"6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9\": container with ID starting with 6d4b45c9a72c62e662a728a34abec71c261e7bd7fa4d79a2788c04639c984ee9 not found: ID does not exist" Mar 19 12:10:03.469100 master-0 kubenswrapper[17644]: I0319 12:10:03.468808 17644 scope.go:117] "RemoveContainer" containerID="35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660" Mar 19 12:10:03.469100 master-0 kubenswrapper[17644]: E0319 12:10:03.469031 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660\": container with ID starting with 35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660 not found: ID does not exist" 
containerID="35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660" Mar 19 12:10:03.469254 master-0 kubenswrapper[17644]: I0319 12:10:03.469055 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660"} err="failed to get container status \"35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660\": rpc error: code = NotFound desc = could not find container \"35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660\": container with ID starting with 35db4f3a681d398c15eb2bb62e101d7bd88e8f0abee04fde1dc85180feca1660 not found: ID does not exist" Mar 19 12:10:03.469254 master-0 kubenswrapper[17644]: I0319 12:10:03.469159 17644 scope.go:117] "RemoveContainer" containerID="07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52" Mar 19 12:10:03.469471 master-0 kubenswrapper[17644]: E0319 12:10:03.469438 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52\": container with ID starting with 07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52 not found: ID does not exist" containerID="07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52" Mar 19 12:10:03.469509 master-0 kubenswrapper[17644]: I0319 12:10:03.469466 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52"} err="failed to get container status \"07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52\": rpc error: code = NotFound desc = could not find container \"07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52\": container with ID starting with 07876832b075d68ba6beb4dc411493e861290319d0cf01c3458b0899ea761f52 not found: ID does not exist" Mar 19 12:10:03.615895 master-0 
kubenswrapper[17644]: E0319 12:10:03.615715 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 12:10:03.690086 master-0 kubenswrapper[17644]: I0319 12:10:03.690043 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:10:03.691191 master-0 kubenswrapper[17644]: I0319 12:10:03.691105 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:03.691824 master-0 kubenswrapper[17644]: I0319 12:10:03.691769 17644 status_manager.go:851] "Failed to get status for pod" podUID="d5f502b117c7c8479f7f20848a50fec0" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:03.749915 master-0 kubenswrapper[17644]: I0319 12:10:03.749831 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-var-lock\") pod \"7def3099-f487-44d4-a1d5-2ae096ef8804\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " Mar 19 12:10:03.750684 master-0 kubenswrapper[17644]: I0319 12:10:03.749891 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-var-lock" (OuterVolumeSpecName: "var-lock") pod 
"7def3099-f487-44d4-a1d5-2ae096ef8804" (UID: "7def3099-f487-44d4-a1d5-2ae096ef8804"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:10:03.750684 master-0 kubenswrapper[17644]: I0319 12:10:03.749996 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7def3099-f487-44d4-a1d5-2ae096ef8804" (UID: "7def3099-f487-44d4-a1d5-2ae096ef8804"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:10:03.750684 master-0 kubenswrapper[17644]: I0319 12:10:03.749935 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-kubelet-dir\") pod \"7def3099-f487-44d4-a1d5-2ae096ef8804\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " Mar 19 12:10:03.751697 master-0 kubenswrapper[17644]: I0319 12:10:03.750803 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access\") pod \"7def3099-f487-44d4-a1d5-2ae096ef8804\" (UID: \"7def3099-f487-44d4-a1d5-2ae096ef8804\") " Mar 19 12:10:03.751697 master-0 kubenswrapper[17644]: I0319 12:10:03.751337 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:10:03.751697 master-0 kubenswrapper[17644]: I0319 12:10:03.751351 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7def3099-f487-44d4-a1d5-2ae096ef8804-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:10:03.753999 master-0 kubenswrapper[17644]: I0319 12:10:03.753959 17644 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7def3099-f487-44d4-a1d5-2ae096ef8804" (UID: "7def3099-f487-44d4-a1d5-2ae096ef8804"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:10:03.853136 master-0 kubenswrapper[17644]: I0319 12:10:03.853078 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7def3099-f487-44d4-a1d5-2ae096ef8804-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:10:04.345264 master-0 kubenswrapper[17644]: I0319 12:10:04.345172 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"7def3099-f487-44d4-a1d5-2ae096ef8804","Type":"ContainerDied","Data":"a175aaff4983d76d4df244070ffa7754983086ec35c53d253a59e2fc3c4007f9"} Mar 19 12:10:04.345264 master-0 kubenswrapper[17644]: I0319 12:10:04.345231 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a175aaff4983d76d4df244070ffa7754983086ec35c53d253a59e2fc3c4007f9" Mar 19 12:10:04.345264 master-0 kubenswrapper[17644]: I0319 12:10:04.345235 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 12:10:04.363232 master-0 kubenswrapper[17644]: I0319 12:10:04.363182 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:04.364174 master-0 kubenswrapper[17644]: I0319 12:10:04.364120 17644 status_manager.go:851] "Failed to get status for pod" podUID="d5f502b117c7c8479f7f20848a50fec0" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:04.417541 master-0 kubenswrapper[17644]: E0319 12:10:04.417460 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 19 12:10:04.483130 master-0 kubenswrapper[17644]: I0319 12:10:04.483086 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:04.484951 master-0 kubenswrapper[17644]: I0319 12:10:04.484886 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:04.485664 master-0 kubenswrapper[17644]: I0319 12:10:04.485613 17644 status_manager.go:851] "Failed to get status for pod" podUID="d5f502b117c7c8479f7f20848a50fec0" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:04.501448 master-0 kubenswrapper[17644]: I0319 12:10:04.501396 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f502b117c7c8479f7f20848a50fec0" path="/var/lib/kubelet/pods/d5f502b117c7c8479f7f20848a50fec0/volumes" Mar 19 12:10:04.505909 master-0 kubenswrapper[17644]: I0319 12:10:04.505861 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:04.505909 master-0 kubenswrapper[17644]: I0319 12:10:04.505905 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:04.507186 master-0 kubenswrapper[17644]: E0319 12:10:04.506996 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: 
connection refused" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:04.508037 master-0 kubenswrapper[17644]: I0319 12:10:04.507990 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:04.532045 master-0 kubenswrapper[17644]: W0319 12:10:04.531989 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd89f86c9be90c18d6ac0ac77e416132.slice/crio-1c4034788b1ca525adcebd61ca3a8b39dbf42c58c493015536d115e1d16f6f79 WatchSource:0}: Error finding container 1c4034788b1ca525adcebd61ca3a8b39dbf42c58c493015536d115e1d16f6f79: Status 404 returned error can't find the container with id 1c4034788b1ca525adcebd61ca3a8b39dbf42c58c493015536d115e1d16f6f79 Mar 19 12:10:05.363851 master-0 kubenswrapper[17644]: I0319 12:10:05.363816 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"cd89f86c9be90c18d6ac0ac77e416132","Type":"ContainerStarted","Data":"47af189c6fe418407bee0b4ae3aebf4ce017e2f48e6f3ef69d49d3a65718b532"} Mar 19 12:10:05.364452 master-0 kubenswrapper[17644]: I0319 12:10:05.364419 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"cd89f86c9be90c18d6ac0ac77e416132","Type":"ContainerStarted","Data":"20a79946a7319a924555715588e94754d34a85951cf806f3329f8ba792634e3a"} Mar 19 12:10:05.364529 master-0 kubenswrapper[17644]: I0319 12:10:05.364517 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"cd89f86c9be90c18d6ac0ac77e416132","Type":"ContainerStarted","Data":"3426c7b28bdb8998d5a49a206f64d0b56bea556fcc685937e8e82a96ddffa422"} Mar 19 12:10:05.364592 master-0 kubenswrapper[17644]: I0319 
12:10:05.364581 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"cd89f86c9be90c18d6ac0ac77e416132","Type":"ContainerStarted","Data":"1c4034788b1ca525adcebd61ca3a8b39dbf42c58c493015536d115e1d16f6f79"} Mar 19 12:10:06.018745 master-0 kubenswrapper[17644]: E0319 12:10:06.018635 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 12:10:06.373910 master-0 kubenswrapper[17644]: I0319 12:10:06.373747 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"cd89f86c9be90c18d6ac0ac77e416132","Type":"ContainerStarted","Data":"4dff33a5ce553ee69697337a00bef65eb4244e0d4f28d8c47661d7648d7c90a4"} Mar 19 12:10:06.374505 master-0 kubenswrapper[17644]: I0319 12:10:06.374034 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:06.374505 master-0 kubenswrapper[17644]: I0319 12:10:06.374054 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:06.374798 master-0 kubenswrapper[17644]: E0319 12:10:06.374760 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:06.375105 master-0 kubenswrapper[17644]: I0319 12:10:06.375049 17644 
status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:06.490511 master-0 kubenswrapper[17644]: I0319 12:10:06.490450 17644 status_manager.go:851] "Failed to get status for pod" podUID="8413125cf444e5c95f023c5dd9c6151e" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:06.490974 master-0 kubenswrapper[17644]: I0319 12:10:06.490937 17644 status_manager.go:851] "Failed to get status for pod" podUID="cd89f86c9be90c18d6ac0ac77e416132" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:06.491385 master-0 kubenswrapper[17644]: I0319 12:10:06.491349 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:07.158804 master-0 kubenswrapper[17644]: E0319 12:10:07.158595 17644 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/events\": dial tcp 192.168.32.10:6443: connect: connection refused" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-master-0.189e3cd7f976f07c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-master-0,UID:8413125cf444e5c95f023c5dd9c6151e,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:10:01.531510908 +0000 UTC m=+635.301468943,LastTimestamp:2026-03-19 12:10:01.531510908 +0000 UTC m=+635.301468943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:10:07.381086 master-0 kubenswrapper[17644]: I0319 12:10:07.380998 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:07.381086 master-0 kubenswrapper[17644]: I0319 12:10:07.381061 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:07.382391 master-0 kubenswrapper[17644]: E0319 12:10:07.382316 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:08.482759 master-0 kubenswrapper[17644]: I0319 12:10:08.482644 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:08.483858 master-0 kubenswrapper[17644]: I0319 12:10:08.483786 17644 status_manager.go:851] "Failed to get status for pod" podUID="8413125cf444e5c95f023c5dd9c6151e" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:08.484288 master-0 kubenswrapper[17644]: I0319 12:10:08.484247 17644 status_manager.go:851] "Failed to get status for pod" podUID="cd89f86c9be90c18d6ac0ac77e416132" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:08.484708 master-0 kubenswrapper[17644]: I0319 12:10:08.484678 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:08.500643 master-0 kubenswrapper[17644]: I0319 12:10:08.500587 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:08.500643 master-0 kubenswrapper[17644]: I0319 12:10:08.500631 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:08.501352 master-0 kubenswrapper[17644]: E0319 12:10:08.501299 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:08.501633 master-0 kubenswrapper[17644]: I0319 12:10:08.501611 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:08.526135 master-0 kubenswrapper[17644]: W0319 12:10:08.525870 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cae843f2a8e3c3c3212b1177305c1d5.slice/crio-46df579573cd0978076ebd09be36d740cbd99dae940a44d1fa6189e22e32bd1b WatchSource:0}: Error finding container 46df579573cd0978076ebd09be36d740cbd99dae940a44d1fa6189e22e32bd1b: Status 404 returned error can't find the container with id 46df579573cd0978076ebd09be36d740cbd99dae940a44d1fa6189e22e32bd1b Mar 19 12:10:09.219493 master-0 kubenswrapper[17644]: E0319 12:10:09.219424 17644 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 19 12:10:09.395013 master-0 kubenswrapper[17644]: I0319 12:10:09.394909 17644 generic.go:334] "Generic (PLEG): container finished" podID="3cae843f2a8e3c3c3212b1177305c1d5" containerID="d71e5ca559a02e3a0851c17edff23c3edcb30e04bfd7a3e3564767eb4411f8ea" exitCode=0 Mar 19 12:10:09.395966 master-0 kubenswrapper[17644]: I0319 12:10:09.394957 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerDied","Data":"d71e5ca559a02e3a0851c17edff23c3edcb30e04bfd7a3e3564767eb4411f8ea"} Mar 19 12:10:09.395966 master-0 kubenswrapper[17644]: I0319 
12:10:09.395253 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"46df579573cd0978076ebd09be36d740cbd99dae940a44d1fa6189e22e32bd1b"} Mar 19 12:10:09.395966 master-0 kubenswrapper[17644]: I0319 12:10:09.395557 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:09.395966 master-0 kubenswrapper[17644]: I0319 12:10:09.395573 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:09.396620 master-0 kubenswrapper[17644]: E0319 12:10:09.396382 17644 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:09.396620 master-0 kubenswrapper[17644]: I0319 12:10:09.396464 17644 status_manager.go:851] "Failed to get status for pod" podUID="8413125cf444e5c95f023c5dd9c6151e" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:09.397565 master-0 kubenswrapper[17644]: I0319 12:10:09.397108 17644 status_manager.go:851] "Failed to get status for pod" podUID="cd89f86c9be90c18d6ac0ac77e416132" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 
12:10:09.397565 master-0 kubenswrapper[17644]: I0319 12:10:09.397533 17644 status_manager.go:851] "Failed to get status for pod" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:10:10.451224 master-0 kubenswrapper[17644]: I0319 12:10:10.451164 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"9e3292f4678bff852d9ada3675811f07c034e96a1ae9312ba98087ecb6cb3853"} Mar 19 12:10:10.451224 master-0 kubenswrapper[17644]: I0319 12:10:10.451214 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"19ffecc812f13a6d6b9de1189c28ec5bbbf1a684a856faac25a7bd2f16714226"} Mar 19 12:10:10.451224 master-0 kubenswrapper[17644]: I0319 12:10:10.451225 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"e84c3915c154a2a62261bf317ce750f94081dbdaa29368dfbf328d8d017316dc"} Mar 19 12:10:10.451224 master-0 kubenswrapper[17644]: I0319 12:10:10.451234 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"1dd8a8f744ef5197363a1605975d136f861da0d476b4607402a34dbfb4e5e4e5"} Mar 19 12:10:11.461341 master-0 kubenswrapper[17644]: I0319 12:10:11.461244 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"2cd13575a0a9c6489ff40f286fc1f2db86bde79d0f0d685da83d0f73bbaaed33"} Mar 19 12:10:11.461934 master-0 kubenswrapper[17644]: I0319 12:10:11.461477 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:11.461934 master-0 kubenswrapper[17644]: I0319 12:10:11.461492 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:11.461934 master-0 kubenswrapper[17644]: I0319 12:10:11.461509 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:13.502047 master-0 kubenswrapper[17644]: I0319 12:10:13.501971 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:13.502047 master-0 kubenswrapper[17644]: I0319 12:10:13.502040 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:13.511660 master-0 kubenswrapper[17644]: I0319 12:10:13.511615 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:14.509325 master-0 kubenswrapper[17644]: I0319 12:10:14.509271 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:14.509325 master-0 kubenswrapper[17644]: I0319 12:10:14.509324 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:14.509325 master-0 kubenswrapper[17644]: I0319 12:10:14.509340 17644 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:14.510009 master-0 kubenswrapper[17644]: I0319 12:10:14.509679 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:14.510009 master-0 kubenswrapper[17644]: I0319 12:10:14.509707 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:14.510009 master-0 kubenswrapper[17644]: I0319 12:10:14.509744 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:14.512417 master-0 kubenswrapper[17644]: I0319 12:10:14.512351 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:14.524039 master-0 kubenswrapper[17644]: I0319 12:10:14.523987 17644 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:15.488338 master-0 kubenswrapper[17644]: I0319 12:10:15.488268 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:15.488338 master-0 kubenswrapper[17644]: I0319 12:10:15.488326 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:15.491691 master-0 kubenswrapper[17644]: I0319 12:10:15.491647 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:16.134912 master-0 
kubenswrapper[17644]: I0319 12:10:16.134851 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:16.502982 master-0 kubenswrapper[17644]: I0319 12:10:16.502927 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:16.502982 master-0 kubenswrapper[17644]: I0319 12:10:16.502963 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:16.508832 master-0 kubenswrapper[17644]: I0319 12:10:16.508789 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:17.091653 master-0 kubenswrapper[17644]: I0319 12:10:17.091595 17644 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:17.508378 master-0 kubenswrapper[17644]: I0319 12:10:17.508318 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:17.508378 master-0 kubenswrapper[17644]: I0319 12:10:17.508351 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7043072c-879c-4116-b7c7-d5301b9aac2c" Mar 19 12:10:18.045173 master-0 kubenswrapper[17644]: I0319 12:10:18.045058 17644 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0d353-2320-4015-a01e-c6f8d64c7954\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:10:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:10:09Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:10:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:10:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd8a8f744ef5197363a1605975d136f861da0d476b4607402a34dbfb4e5e4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T12:10:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\
":\\\"cri-o://19ffecc812f13a6d6b9de1189c28ec5bbbf1a684a856faac25a7bd2f16714226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T12:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e84c3915c154a2a62261bf317ce750f94081dbdaa29368dfbf328d8d017316dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T12:10:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd13575a0a9c6489ff40f286fc1f2db86bde79d0f0d685da83d0f73bbaaed33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"read
y\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T12:10:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e3292f4678bff852d9ada3675811f07c034e96a1ae9312ba98087ecb6cb3853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T12:10:10Z\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d71e5ca559a02e3a0851c17edff23c3edcb30e04bfd7a3e3564767eb4411f8ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d71e5ca559a02e3a0851c17edff23c3edcb30e04bfd7a3e3564767eb4411f8ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T12:10:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T12:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}]}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-master-0\": pods \"kube-apiserver-master-0\" not found" Mar 19 
12:10:18.102259 master-0 kubenswrapper[17644]: I0319 12:10:18.102153 17644 request.go:700] Waited for 1.008508612s, retries: 1, retry-after: 5s - retry-reason: 503 - request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-catalogd/secrets?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dcatalogserver-cert&resourceVersion=15705&timeout=49m26s&timeoutSeconds=2966&watch=true Mar 19 12:10:18.506496 master-0 kubenswrapper[17644]: I0319 12:10:18.506442 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:18.514045 master-0 kubenswrapper[17644]: I0319 12:10:18.513991 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:18.514045 master-0 kubenswrapper[17644]: I0319 12:10:18.514031 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:18.749124 master-0 kubenswrapper[17644]: I0319 12:10:18.749053 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="cd89f86c9be90c18d6ac0ac77e416132" podUID="08613490-cb99-48af-9e68-0d3810f9bb04" Mar 19 12:10:18.817834 master-0 kubenswrapper[17644]: I0319 12:10:18.814120 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="3cae843f2a8e3c3c3212b1177305c1d5" podUID="c7cb0f3b-5010-4e13-8be0-2b6ad3e94ac6" Mar 19 12:10:19.404605 master-0 kubenswrapper[17644]: I0319 12:10:19.404545 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 12:10:19.519820 master-0 kubenswrapper[17644]: I0319 12:10:19.519773 17644 
kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:19.519820 master-0 kubenswrapper[17644]: I0319 12:10:19.519813 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="00c0d353-2320-4015-a01e-c6f8d64c7954" Mar 19 12:10:20.019808 master-0 kubenswrapper[17644]: I0319 12:10:20.019690 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 19 12:10:20.052561 master-0 kubenswrapper[17644]: I0319 12:10:20.052494 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 12:10:20.054861 master-0 kubenswrapper[17644]: I0319 12:10:20.054816 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 12:10:20.530134 master-0 kubenswrapper[17644]: I0319 12:10:20.530042 17644 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="8c318dcaa73dc8cbe4b4aad8b140d9a7a1894465aa62d9fd6977068c310f6aa4" exitCode=0 Mar 19 12:10:20.530134 master-0 kubenswrapper[17644]: I0319 12:10:20.530126 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerDied","Data":"8c318dcaa73dc8cbe4b4aad8b140d9a7a1894465aa62d9fd6977068c310f6aa4"} Mar 19 12:10:20.530781 master-0 kubenswrapper[17644]: I0319 12:10:20.530534 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26" Mar 19 12:10:20.530781 master-0 kubenswrapper[17644]: I0319 12:10:20.530552 17644 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26" Mar 19 12:10:20.552076 master-0 kubenswrapper[17644]: I0319 12:10:20.551972 17644 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:10:20.555996 master-0 kubenswrapper[17644]: I0319 12:10:20.555932 17644 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc6d86-5946-4308-8e3c-f5dc6bfc7a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T12:10:20Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c318dcaa73dc8cbe4b4aad8b140d9a7a1894465aa62d9fd6977068c310f6aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c318dcaa73dc8cbe4b4aad8b140d9a7a1894465aa62d9fd6977068c310f6aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T12:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T12:10:01Z\\\"}}}]}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-master-0\": pods 
\"openshift-kube-scheduler-master-0\" not found" Mar 19 12:10:20.634807 master-0 kubenswrapper[17644]: I0319 12:10:20.634718 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-2l456" Mar 19 12:10:20.755534 master-0 kubenswrapper[17644]: I0319 12:10:20.755357 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 19 12:10:20.789400 master-0 kubenswrapper[17644]: I0319 12:10:20.789283 17644 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 12:10:21.031712 master-0 kubenswrapper[17644]: I0319 12:10:21.031417 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 12:10:21.214757 master-0 kubenswrapper[17644]: I0319 12:10:21.214702 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 12:10:21.556705 master-0 kubenswrapper[17644]: I0319 12:10:21.556564 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"4e4bd9cd97aeb438fa288f28eb9a2794b55178573d2cb02c878fb095c08d8f77"} Mar 19 12:10:21.556705 master-0 kubenswrapper[17644]: I0319 12:10:21.556636 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"bff032679d3fb081299b22ff5b7b62090bb7d276aba9e7f46cfc332fa69cff75"} Mar 19 12:10:21.556705 master-0 kubenswrapper[17644]: I0319 12:10:21.556647 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"4919fbf03e4bc58c945190238fdf90e9f90e40bda0761d35f3906638f11d71e6"} Mar 19 12:10:21.557358 master-0 kubenswrapper[17644]: I0319 12:10:21.556979 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:10:21.557358 master-0 kubenswrapper[17644]: I0319 12:10:21.557025 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26" Mar 19 12:10:21.557358 master-0 kubenswrapper[17644]: I0319 12:10:21.557046 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26" Mar 19 12:10:21.560697 master-0 kubenswrapper[17644]: I0319 12:10:21.560614 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="d17977e9-8cc0-4894-ba52-1639331d1681" Mar 19 12:10:22.458037 master-0 kubenswrapper[17644]: I0319 12:10:22.457980 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 12:10:22.563666 master-0 kubenswrapper[17644]: I0319 12:10:22.563610 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26" Mar 19 12:10:22.563666 master-0 kubenswrapper[17644]: I0319 12:10:22.563654 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26" Mar 19 12:10:22.587162 master-0 kubenswrapper[17644]: I0319 12:10:22.587094 17644 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Mar 19 12:10:22.588775 master-0 kubenswrapper[17644]: I0319 12:10:22.588724 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 12:10:22.616227 master-0 kubenswrapper[17644]: I0319 12:10:22.616184 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 12:10:22.679319 master-0 kubenswrapper[17644]: I0319 12:10:22.679266 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 12:10:22.758718 master-0 kubenswrapper[17644]: I0319 12:10:22.758614 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 12:10:22.774760 master-0 kubenswrapper[17644]: I0319 12:10:22.769914 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 12:10:22.884818 master-0 kubenswrapper[17644]: I0319 12:10:22.884775 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 12:10:22.963326 master-0 kubenswrapper[17644]: I0319 12:10:22.963262 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 12:10:23.055773 master-0 kubenswrapper[17644]: I0319 12:10:23.055654 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 12:10:23.055992 master-0 kubenswrapper[17644]: I0319 12:10:23.055954 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 12:10:23.104333 master-0 kubenswrapper[17644]: I0319 12:10:23.104265 17644 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-7qzrj" Mar 19 12:10:23.155568 master-0 kubenswrapper[17644]: I0319 12:10:23.155485 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-5j36i3nc1dj99" Mar 19 12:10:23.166579 master-0 kubenswrapper[17644]: I0319 12:10:23.166510 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 12:10:23.193601 master-0 kubenswrapper[17644]: I0319 12:10:23.193515 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 12:10:23.247114 master-0 kubenswrapper[17644]: I0319 12:10:23.247064 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 12:10:23.347338 master-0 kubenswrapper[17644]: I0319 12:10:23.347184 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 12:10:23.378883 master-0 kubenswrapper[17644]: I0319 12:10:23.378821 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 12:10:23.427161 master-0 kubenswrapper[17644]: I0319 12:10:23.427103 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 12:10:23.434598 master-0 kubenswrapper[17644]: I0319 12:10:23.434542 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 12:10:23.507966 master-0 kubenswrapper[17644]: I0319 12:10:23.507921 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 12:10:23.508607 master-0 kubenswrapper[17644]: I0319 12:10:23.508568 17644 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 12:10:23.544714 master-0 kubenswrapper[17644]: I0319 12:10:23.544530 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 12:10:23.554908 master-0 kubenswrapper[17644]: I0319 12:10:23.554862 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 19 12:10:23.563571 master-0 kubenswrapper[17644]: I0319 12:10:23.563489 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 19 12:10:23.636759 master-0 kubenswrapper[17644]: I0319 12:10:23.636627 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 12:10:23.637231 master-0 kubenswrapper[17644]: I0319 12:10:23.636872 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 12:10:23.715638 master-0 kubenswrapper[17644]: I0319 12:10:23.715558 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 19 12:10:23.725663 master-0 kubenswrapper[17644]: I0319 12:10:23.725594 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 12:10:23.750577 master-0 kubenswrapper[17644]: I0319 12:10:23.750504 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-6qwsh"
Mar 19 12:10:23.802377 master-0 kubenswrapper[17644]: I0319 12:10:23.802062 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 12:10:23.919751 master-0 kubenswrapper[17644]: I0319 12:10:23.919511 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 12:10:23.956686 master-0 kubenswrapper[17644]: I0319 12:10:23.956626 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-kn6lc"
Mar 19 12:10:23.957133 master-0 kubenswrapper[17644]: I0319 12:10:23.956755 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 12:10:23.963477 master-0 kubenswrapper[17644]: I0319 12:10:23.963365 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-w94s8"
Mar 19 12:10:23.965744 master-0 kubenswrapper[17644]: I0319 12:10:23.965681 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 12:10:23.984628 master-0 kubenswrapper[17644]: I0319 12:10:23.984538 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 12:10:24.005402 master-0 kubenswrapper[17644]: I0319 12:10:24.005315 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 12:10:24.051000 master-0 kubenswrapper[17644]: I0319 12:10:24.050935 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 12:10:24.070317 master-0 kubenswrapper[17644]: I0319 12:10:24.070249 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 12:10:24.092497 master-0 kubenswrapper[17644]: I0319 12:10:24.092411 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.110993 master-0 kubenswrapper[17644]: I0319 12:10:24.110909 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 12:10:24.159972 master-0 kubenswrapper[17644]: I0319 12:10:24.159873 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 12:10:24.188063 master-0 kubenswrapper[17644]: I0319 12:10:24.187958 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 12:10:24.197371 master-0 kubenswrapper[17644]: I0319 12:10:24.197311 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 12:10:24.202960 master-0 kubenswrapper[17644]: I0319 12:10:24.202889 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 12:10:24.285376 master-0 kubenswrapper[17644]: I0319 12:10:24.285264 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-9hcb7"
Mar 19 12:10:24.300479 master-0 kubenswrapper[17644]: I0319 12:10:24.300353 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 12:10:24.321499 master-0 kubenswrapper[17644]: I0319 12:10:24.321424 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 12:10:24.328851 master-0 kubenswrapper[17644]: I0319 12:10:24.328810 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 12:10:24.338628 master-0 kubenswrapper[17644]: I0319 12:10:24.338502 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 12:10:24.369555 master-0 kubenswrapper[17644]: I0319 12:10:24.369477 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 12:10:24.451263 master-0 kubenswrapper[17644]: I0319 12:10:24.451134 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 12:10:24.480452 master-0 kubenswrapper[17644]: I0319 12:10:24.480403 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 12:10:24.560933 master-0 kubenswrapper[17644]: I0319 12:10:24.560889 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.565697 master-0 kubenswrapper[17644]: I0319 12:10:24.565681 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-qsmbf"
Mar 19 12:10:24.576634 master-0 kubenswrapper[17644]: I0319 12:10:24.576588 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 12:10:24.591319 master-0 kubenswrapper[17644]: I0319 12:10:24.591285 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 19 12:10:24.605516 master-0 kubenswrapper[17644]: I0319 12:10:24.605469 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.613413 master-0 kubenswrapper[17644]: I0319 12:10:24.613371 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.670468 master-0 kubenswrapper[17644]: I0319 12:10:24.670390 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-c95p8"
Mar 19 12:10:24.732641 master-0 kubenswrapper[17644]: I0319 12:10:24.732414 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 12:10:24.739844 master-0 kubenswrapper[17644]: I0319 12:10:24.739788 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 19 12:10:24.761892 master-0 kubenswrapper[17644]: I0319 12:10:24.761796 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 12:10:24.838410 master-0 kubenswrapper[17644]: I0319 12:10:24.838342 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 12:10:24.842688 master-0 kubenswrapper[17644]: I0319 12:10:24.842567 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 12:10:24.858702 master-0 kubenswrapper[17644]: I0319 12:10:24.858620 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.889394 master-0 kubenswrapper[17644]: I0319 12:10:24.889319 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 12:10:24.917824 master-0 kubenswrapper[17644]: I0319 12:10:24.917746 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.953721 master-0 kubenswrapper[17644]: I0319 12:10:24.953646 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.964832 master-0 kubenswrapper[17644]: I0319 12:10:24.964723 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 12:10:24.976369 master-0 kubenswrapper[17644]: I0319 12:10:24.976295 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 12:10:25.022584 master-0 kubenswrapper[17644]: I0319 12:10:25.022480 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 12:10:25.026185 master-0 kubenswrapper[17644]: I0319 12:10:25.026149 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 12:10:25.026956 master-0 kubenswrapper[17644]: I0319 12:10:25.026929 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 12:10:25.050571 master-0 kubenswrapper[17644]: I0319 12:10:25.050491 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 12:10:25.097511 master-0 kubenswrapper[17644]: I0319 12:10:25.097431 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 12:10:25.099248 master-0 kubenswrapper[17644]: I0319 12:10:25.099192 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 12:10:25.134450 master-0 kubenswrapper[17644]: I0319 12:10:25.134371 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 12:10:25.180611 master-0 kubenswrapper[17644]: I0319 12:10:25.180532 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 12:10:25.205248 master-0 kubenswrapper[17644]: I0319 12:10:25.205170 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 12:10:25.230787 master-0 kubenswrapper[17644]: I0319 12:10:25.230639 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 12:10:25.239312 master-0 kubenswrapper[17644]: I0319 12:10:25.239261 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 12:10:25.252866 master-0 kubenswrapper[17644]: I0319 12:10:25.252821 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 12:10:25.287629 master-0 kubenswrapper[17644]: I0319 12:10:25.287450 17644 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 12:10:25.290172 master-0 kubenswrapper[17644]: I0319 12:10:25.290128 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 19 12:10:25.297467 master-0 kubenswrapper[17644]: I0319 12:10:25.297391 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-dfb6p"
Mar 19 12:10:25.331362 master-0 kubenswrapper[17644]: I0319 12:10:25.331269 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 12:10:25.348621 master-0 kubenswrapper[17644]: I0319 12:10:25.348567 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-zf4zz"
Mar 19 12:10:25.365542 master-0 kubenswrapper[17644]: I0319 12:10:25.365471 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 12:10:25.379267 master-0 kubenswrapper[17644]: I0319 12:10:25.379221 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 12:10:25.448767 master-0 kubenswrapper[17644]: I0319 12:10:25.448645 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-5jj8d"
Mar 19 12:10:25.457545 master-0 kubenswrapper[17644]: I0319 12:10:25.457466 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 12:10:25.488570 master-0 kubenswrapper[17644]: I0319 12:10:25.488504 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 12:10:25.565282 master-0 kubenswrapper[17644]: I0319 12:10:25.565179 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 12:10:25.577356 master-0 kubenswrapper[17644]: I0319 12:10:25.577291 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 12:10:25.587375 master-0 kubenswrapper[17644]: I0319 12:10:25.587321 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 12:10:25.639462 master-0 kubenswrapper[17644]: I0319 12:10:25.639367 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-68jgh"
Mar 19 12:10:25.643059 master-0 kubenswrapper[17644]: I0319 12:10:25.643004 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 12:10:25.662098 master-0 kubenswrapper[17644]: I0319 12:10:25.662027 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 12:10:25.724457 master-0 kubenswrapper[17644]: I0319 12:10:25.724372 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 12:10:25.729861 master-0 kubenswrapper[17644]: I0319 12:10:25.729789 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 12:10:25.756310 master-0 kubenswrapper[17644]: I0319 12:10:25.756225 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 12:10:25.817359 master-0 kubenswrapper[17644]: I0319 12:10:25.817164 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 12:10:25.817359 master-0 kubenswrapper[17644]: I0319 12:10:25.817313 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 19 12:10:25.868025 master-0 kubenswrapper[17644]: I0319 12:10:25.867941 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 19 12:10:25.872456 master-0 kubenswrapper[17644]: I0319 12:10:25.872402 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 12:10:25.883030 master-0 kubenswrapper[17644]: I0319 12:10:25.882981 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 12:10:26.008428 master-0 kubenswrapper[17644]: I0319 12:10:26.008360 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 12:10:26.023515 master-0 kubenswrapper[17644]: I0319 12:10:26.023428 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 12:10:26.029413 master-0 kubenswrapper[17644]: I0319 12:10:26.029366 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 12:10:26.030330 master-0 kubenswrapper[17644]: I0319 12:10:26.030301 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 12:10:26.036413 master-0 kubenswrapper[17644]: I0319 12:10:26.036384 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 12:10:26.075490 master-0 kubenswrapper[17644]: I0319 12:10:26.075381 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 12:10:26.091870 master-0 kubenswrapper[17644]: I0319 12:10:26.091808 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 12:10:26.110241 master-0 kubenswrapper[17644]: I0319 12:10:26.110185 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-zmgxk"
Mar 19 12:10:26.139613 master-0 kubenswrapper[17644]: I0319 12:10:26.139546 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 12:10:26.229183 master-0 kubenswrapper[17644]: I0319 12:10:26.229103 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 12:10:26.237745 master-0 kubenswrapper[17644]: I0319 12:10:26.237685 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-b27xz"
Mar 19 12:10:26.239788 master-0 kubenswrapper[17644]: I0319 12:10:26.239687 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 12:10:26.265282 master-0 kubenswrapper[17644]: I0319 12:10:26.265200 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 12:10:26.294226 master-0 kubenswrapper[17644]: I0319 12:10:26.294122 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 12:10:26.306508 master-0 kubenswrapper[17644]: I0319 12:10:26.306432 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-cjb2h"
Mar 19 12:10:26.316705 master-0 kubenswrapper[17644]: I0319 12:10:26.316608 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-899lw"
Mar 19 12:10:26.363764 master-0 kubenswrapper[17644]: I0319 12:10:26.363569 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 19 12:10:26.403901 master-0 kubenswrapper[17644]: I0319 12:10:26.403833 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 12:10:26.420269 master-0 kubenswrapper[17644]: I0319 12:10:26.420216 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 12:10:26.450657 master-0 kubenswrapper[17644]: I0319 12:10:26.450529 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 12:10:26.457688 master-0 kubenswrapper[17644]: I0319 12:10:26.457641 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 19 12:10:26.495767 master-0 kubenswrapper[17644]: I0319 12:10:26.495524 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 12:10:26.496064 master-0 kubenswrapper[17644]: I0319 12:10:26.496010 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="3cae843f2a8e3c3c3212b1177305c1d5" podUID="c7cb0f3b-5010-4e13-8be0-2b6ad3e94ac6"
Mar 19 12:10:26.499567 master-0 kubenswrapper[17644]: I0319 12:10:26.499517 17644 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="d17977e9-8cc0-4894-ba52-1639331d1681"
Mar 19 12:10:26.501578 master-0 kubenswrapper[17644]: I0319 12:10:26.501548 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-pdvk4"
Mar 19 12:10:26.553294 master-0 kubenswrapper[17644]: I0319 12:10:26.553225 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 12:10:26.556238 master-0 kubenswrapper[17644]: I0319 12:10:26.556070 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 12:10:26.644766 master-0 kubenswrapper[17644]: I0319 12:10:26.644570 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-mv7dz"
Mar 19 12:10:26.685685 master-0 kubenswrapper[17644]: I0319 12:10:26.684512 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 12:10:26.690083 master-0 kubenswrapper[17644]: I0319 12:10:26.689906 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 19 12:10:26.692382 master-0 kubenswrapper[17644]: I0319 12:10:26.692336 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 12:10:26.735812 master-0 kubenswrapper[17644]: I0319 12:10:26.735698 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 19 12:10:26.750138 master-0 kubenswrapper[17644]: I0319 12:10:26.750092 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 12:10:26.753764 master-0 kubenswrapper[17644]: I0319 12:10:26.753588 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 12:10:26.783619 master-0 kubenswrapper[17644]: I0319 12:10:26.783134 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 12:10:26.789286 master-0 kubenswrapper[17644]: I0319 12:10:26.789206 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 12:10:26.789412 master-0 kubenswrapper[17644]: I0319 12:10:26.789384 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 19 12:10:26.831666 master-0 kubenswrapper[17644]: I0319 12:10:26.831605 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 12:10:26.840894 master-0 kubenswrapper[17644]: I0319 12:10:26.840834 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:10:26.890856 master-0 kubenswrapper[17644]: I0319 12:10:26.890649 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 12:10:26.914545 master-0 kubenswrapper[17644]: I0319 12:10:26.914446 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 12:10:26.952626 master-0 kubenswrapper[17644]: I0319 12:10:26.952543 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 12:10:27.011624 master-0 kubenswrapper[17644]: I0319 12:10:27.011017 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 12:10:27.062229 master-0 kubenswrapper[17644]: I0319 12:10:27.062155 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 12:10:27.134670 master-0 kubenswrapper[17644]: I0319 12:10:27.134599 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 12:10:27.144645 master-0 kubenswrapper[17644]: I0319 12:10:27.144586 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 12:10:27.154869 master-0 kubenswrapper[17644]: I0319 12:10:27.154799 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 19 12:10:27.169359 master-0 kubenswrapper[17644]: I0319 12:10:27.169258 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 12:10:27.203707 master-0 kubenswrapper[17644]: I0319 12:10:27.203610 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 12:10:27.212286 master-0 kubenswrapper[17644]: I0319 12:10:27.212232 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 12:10:27.280945 master-0 kubenswrapper[17644]: I0319 12:10:27.280562 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 12:10:27.284687 master-0 kubenswrapper[17644]: I0319 12:10:27.284633 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 12:10:27.333805 master-0 kubenswrapper[17644]: I0319 12:10:27.333631 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 12:10:27.335400 master-0 kubenswrapper[17644]: I0319 12:10:27.335331 17644 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 12:10:27.366279 master-0 kubenswrapper[17644]: I0319 12:10:27.366163 17644 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 12:10:27.374294 master-0 kubenswrapper[17644]: I0319 12:10:27.374192 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 12:10:27.374294 master-0 kubenswrapper[17644]: I0319 12:10:27.374269 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 12:10:27.374872 master-0 kubenswrapper[17644]: I0319 12:10:27.374584 17644 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26"
Mar 19 12:10:27.374872 master-0 kubenswrapper[17644]: I0319 12:10:27.374613 17644 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="aacc6d86-5946-4308-8e3c-f5dc6bfc7a26"
Mar 19 12:10:27.380952 master-0 kubenswrapper[17644]: I0319 12:10:27.380875 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:27.437757 master-0 kubenswrapper[17644]: I0319 12:10:27.434741 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-kb5zl"
Mar 19 12:10:27.437757 master-0 kubenswrapper[17644]: I0319 12:10:27.434967 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 12:10:27.443411 master-0 kubenswrapper[17644]: I0319 12:10:27.442803 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=9.442776881 podStartE2EDuration="9.442776881s" podCreationTimestamp="2026-03-19 12:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:10:27.439304868 +0000 UTC m=+661.209262913" watchObservedRunningTime="2026-03-19 12:10:27.442776881 +0000 UTC m=+661.212734906"
Mar 19 12:10:27.469479 master-0 kubenswrapper[17644]: I0319 12:10:27.469297 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=7.469278256 podStartE2EDuration="7.469278256s" podCreationTimestamp="2026-03-19 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:10:27.465164247 +0000 UTC m=+661.235122292" watchObservedRunningTime="2026-03-19 12:10:27.469278256 +0000 UTC m=+661.239236291"
Mar 19 12:10:27.492713 master-0 kubenswrapper[17644]: I0319 12:10:27.492580 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=13.492547984 podStartE2EDuration="13.492547984s" podCreationTimestamp="2026-03-19 12:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:10:27.481758325 +0000 UTC m=+661.251716360" watchObservedRunningTime="2026-03-19 12:10:27.492547984 +0000 UTC m=+661.262506019"
Mar 19 12:10:27.632233 master-0 kubenswrapper[17644]: I0319 12:10:27.632150 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 12:10:27.656026 master-0 kubenswrapper[17644]: I0319 12:10:27.655929 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 12:10:27.657895 master-0 kubenswrapper[17644]: I0319 12:10:27.657853 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 12:10:27.660318 master-0 kubenswrapper[17644]: I0319 12:10:27.660277 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 12:10:27.689064 master-0 kubenswrapper[17644]: I0319 12:10:27.688932 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 19 12:10:27.703144 master-0 kubenswrapper[17644]: I0319 12:10:27.703068 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 12:10:27.727080 master-0 kubenswrapper[17644]: I0319 12:10:27.726975 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 12:10:27.762135 master-0 kubenswrapper[17644]: I0319 12:10:27.762056 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-6gp54"
Mar 19 12:10:27.768253 master-0 kubenswrapper[17644]: I0319 12:10:27.768149 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 12:10:27.811702 master-0 kubenswrapper[17644]: I0319 12:10:27.811615 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 12:10:27.850945 master-0 kubenswrapper[17644]: I0319 12:10:27.850868 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 12:10:27.905029 master-0 kubenswrapper[17644]: I0319 12:10:27.904942 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 12:10:27.915422 master-0 kubenswrapper[17644]: I0319 12:10:27.915349 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 12:10:27.932372 master-0 kubenswrapper[17644]: I0319 12:10:27.932309 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 12:10:28.027003 master-0 kubenswrapper[17644]: I0319 12:10:28.026767 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 12:10:28.029676 master-0 kubenswrapper[17644]: I0319 12:10:28.029593 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 12:10:28.140214 master-0 kubenswrapper[17644]: I0319 12:10:28.140152 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 12:10:28.178487 master-0 kubenswrapper[17644]: I0319 12:10:28.178421 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 12:10:28.246575 master-0 kubenswrapper[17644]: I0319 12:10:28.246511 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 19 12:10:28.251531 master-0 kubenswrapper[17644]: I0319 12:10:28.251497 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 19 12:10:28.278981 master-0 kubenswrapper[17644]: I0319 12:10:28.278848 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8skrb"
Mar 19 12:10:28.279159 master-0 kubenswrapper[17644]: I0319 12:10:28.279133 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 12:10:28.280412 master-0 kubenswrapper[17644]: I0319 12:10:28.280354 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 12:10:28.292687 master-0 kubenswrapper[17644]: I0319 12:10:28.292639 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 12:10:28.313664 master-0 kubenswrapper[17644]: I0319 12:10:28.313607 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-6zbld"
Mar 19 12:10:28.328208 master-0 kubenswrapper[17644]: I0319 12:10:28.328146 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 19 12:10:28.339440 master-0 kubenswrapper[17644]: I0319 12:10:28.339379 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 12:10:28.366360 master-0 kubenswrapper[17644]: I0319 12:10:28.366282 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:10:28.469475 master-0 kubenswrapper[17644]: I0319 12:10:28.469422 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 12:10:28.521373 master-0 kubenswrapper[17644]: I0319 12:10:28.521316 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 19 12:10:28.537017 master-0 kubenswrapper[17644]: I0319 12:10:28.536860 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 12:10:28.556038 master-0 kubenswrapper[17644]: I0319 12:10:28.555984 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 12:10:28.565197 master-0 kubenswrapper[17644]: I0319 12:10:28.565105 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-g72px"
Mar 19 12:10:28.570334 master-0 kubenswrapper[17644]: I0319 12:10:28.570307 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 12:10:28.579607 master-0 kubenswrapper[17644]: I0319 12:10:28.579541 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 12:10:28.581982 master-0 kubenswrapper[17644]: I0319 12:10:28.581928 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-bqm2750ms6tma"
Mar 19 12:10:28.590664 master-0 kubenswrapper[17644]: I0319 12:10:28.590611 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-sx7wj"
Mar 19 12:10:28.616912 master-0 kubenswrapper[17644]: I0319 12:10:28.616839 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-7gswr"
Mar 19 12:10:28.631983 master-0 kubenswrapper[17644]: I0319 12:10:28.631922 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 12:10:28.636888 master-0 kubenswrapper[17644]: I0319 12:10:28.636832 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 12:10:28.657587 master-0 kubenswrapper[17644]: I0319 12:10:28.657528 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 12:10:28.673762 master-0 kubenswrapper[17644]: I0319 12:10:28.673393 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 12:10:28.682634 master-0 kubenswrapper[17644]: I0319 12:10:28.682306 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 12:10:28.805599 master-0 kubenswrapper[17644]: I0319 12:10:28.805486 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 12:10:28.815508 master-0 kubenswrapper[17644]: I0319 12:10:28.815437 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 12:10:28.816352 master-0 kubenswrapper[17644]: I0319 12:10:28.816276 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 12:10:28.853528 master-0 kubenswrapper[17644]: I0319 12:10:28.853474 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 12:10:28.857241 master-0
kubenswrapper[17644]: I0319 12:10:28.857181 17644 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 12:10:28.857539 master-0 kubenswrapper[17644]: I0319 12:10:28.857498 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="7a4744531cb137d7252790be662d8cc8" containerName="startup-monitor" containerID="cri-o://d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745" gracePeriod=5 Mar 19 12:10:28.875214 master-0 kubenswrapper[17644]: I0319 12:10:28.875155 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 12:10:28.907600 master-0 kubenswrapper[17644]: I0319 12:10:28.905840 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 12:10:28.914012 master-0 kubenswrapper[17644]: I0319 12:10:28.913967 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 12:10:28.931699 master-0 kubenswrapper[17644]: I0319 12:10:28.931637 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 12:10:29.092618 master-0 kubenswrapper[17644]: I0319 12:10:29.092483 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 19 12:10:29.141451 master-0 kubenswrapper[17644]: I0319 12:10:29.141394 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 12:10:29.202379 master-0 kubenswrapper[17644]: I0319 12:10:29.202320 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 12:10:29.242492 master-0 
kubenswrapper[17644]: I0319 12:10:29.242426 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 12:10:29.282341 master-0 kubenswrapper[17644]: I0319 12:10:29.282247 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-tnp7t" Mar 19 12:10:29.366359 master-0 kubenswrapper[17644]: I0319 12:10:29.366236 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 12:10:29.429191 master-0 kubenswrapper[17644]: I0319 12:10:29.429143 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 12:10:29.459437 master-0 kubenswrapper[17644]: I0319 12:10:29.459383 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-8h56m" Mar 19 12:10:29.570046 master-0 kubenswrapper[17644]: I0319 12:10:29.569995 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 12:10:29.642529 master-0 kubenswrapper[17644]: I0319 12:10:29.642396 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-6nq75" Mar 19 12:10:29.650579 master-0 kubenswrapper[17644]: I0319 12:10:29.650517 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 19 12:10:29.650897 master-0 kubenswrapper[17644]: I0319 12:10:29.650864 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 12:10:29.735569 master-0 kubenswrapper[17644]: I0319 12:10:29.735509 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 
12:10:29.800295 master-0 kubenswrapper[17644]: I0319 12:10:29.800252 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 12:10:29.802817 master-0 kubenswrapper[17644]: I0319 12:10:29.802771 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 12:10:29.943566 master-0 kubenswrapper[17644]: I0319 12:10:29.943510 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 12:10:30.014318 master-0 kubenswrapper[17644]: I0319 12:10:30.014270 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 12:10:30.197930 master-0 kubenswrapper[17644]: I0319 12:10:30.197802 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 12:10:30.245690 master-0 kubenswrapper[17644]: I0319 12:10:30.245585 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 12:10:30.291608 master-0 kubenswrapper[17644]: I0319 12:10:30.291541 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 12:10:30.512044 master-0 kubenswrapper[17644]: I0319 12:10:30.511940 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 12:10:30.526063 master-0 kubenswrapper[17644]: I0319 12:10:30.526027 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-75w76" Mar 19 12:10:30.689977 master-0 kubenswrapper[17644]: I0319 12:10:30.689907 17644 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 19 12:10:30.722270 master-0 kubenswrapper[17644]: I0319 12:10:30.722201 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 12:10:31.783380 master-0 kubenswrapper[17644]: I0319 12:10:31.783327 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 12:10:31.819928 master-0 kubenswrapper[17644]: I0319 12:10:31.819838 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 12:10:32.051133 master-0 kubenswrapper[17644]: I0319 12:10:32.051001 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 12:10:32.071702 master-0 kubenswrapper[17644]: I0319 12:10:32.071615 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-strbt" Mar 19 12:10:33.112699 master-0 kubenswrapper[17644]: I0319 12:10:33.112618 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 12:10:33.981519 master-0 kubenswrapper[17644]: I0319 12:10:33.981317 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 12:10:34.113014 master-0 kubenswrapper[17644]: I0319 12:10:34.112811 17644 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 12:10:34.264882 master-0 kubenswrapper[17644]: I0319 12:10:34.264847 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 12:10:34.275197 master-0 kubenswrapper[17644]: I0319 12:10:34.275154 17644 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 12:10:34.433159 master-0 kubenswrapper[17644]: I0319 12:10:34.433091 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_7a4744531cb137d7252790be662d8cc8/startup-monitor/0.log" Mar 19 12:10:34.433430 master-0 kubenswrapper[17644]: I0319 12:10:34.433240 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:34.532203 master-0 kubenswrapper[17644]: I0319 12:10:34.530701 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " Mar 19 12:10:34.532203 master-0 kubenswrapper[17644]: I0319 12:10:34.531008 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " Mar 19 12:10:34.532203 master-0 kubenswrapper[17644]: I0319 12:10:34.531045 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " Mar 19 12:10:34.532203 master-0 kubenswrapper[17644]: I0319 12:10:34.531105 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " Mar 19 12:10:34.532203 master-0 
kubenswrapper[17644]: I0319 12:10:34.531125 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " Mar 19 12:10:34.532203 master-0 kubenswrapper[17644]: I0319 12:10:34.530888 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:10:34.532203 master-0 kubenswrapper[17644]: I0319 12:10:34.531388 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log" (OuterVolumeSpecName: "var-log") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:10:34.532203 master-0 kubenswrapper[17644]: I0319 12:10:34.531427 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock" (OuterVolumeSpecName: "var-lock") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:10:34.532203 master-0 kubenswrapper[17644]: I0319 12:10:34.531771 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests" (OuterVolumeSpecName: "manifests") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:10:34.536716 master-0 kubenswrapper[17644]: I0319 12:10:34.536664 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:10:34.548877 master-0 kubenswrapper[17644]: I0319 12:10:34.548849 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 12:10:34.632820 master-0 kubenswrapper[17644]: I0319 12:10:34.632643 17644 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") on node \"master-0\" DevicePath \"\"" Mar 19 12:10:34.632820 master-0 kubenswrapper[17644]: I0319 12:10:34.632705 17644 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:10:34.632820 master-0 kubenswrapper[17644]: I0319 12:10:34.632717 17644 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:10:34.632820 master-0 kubenswrapper[17644]: I0319 12:10:34.632753 17644 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:10:34.632820 master-0 kubenswrapper[17644]: I0319 12:10:34.632763 17644 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") on node \"master-0\" DevicePath \"\"" Mar 19 12:10:34.653010 master-0 kubenswrapper[17644]: I0319 12:10:34.652944 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_7a4744531cb137d7252790be662d8cc8/startup-monitor/0.log" Mar 19 12:10:34.653248 master-0 kubenswrapper[17644]: I0319 12:10:34.653015 17644 generic.go:334] "Generic (PLEG): container finished" podID="7a4744531cb137d7252790be662d8cc8" containerID="d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745" exitCode=137 Mar 19 12:10:34.653248 master-0 kubenswrapper[17644]: I0319 12:10:34.653077 17644 scope.go:117] "RemoveContainer" containerID="d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745" Mar 19 12:10:34.653248 master-0 kubenswrapper[17644]: I0319 12:10:34.653097 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:34.671570 master-0 kubenswrapper[17644]: I0319 12:10:34.671511 17644 scope.go:117] "RemoveContainer" containerID="d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745" Mar 19 12:10:34.672257 master-0 kubenswrapper[17644]: E0319 12:10:34.672211 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745\": container with ID starting with d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745 not found: ID does not exist" containerID="d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745" Mar 19 12:10:34.672364 master-0 kubenswrapper[17644]: I0319 12:10:34.672268 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745"} err="failed to get 
container status \"d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745\": rpc error: code = NotFound desc = could not find container \"d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745\": container with ID starting with d5372bed019a717174b80a86dd5805bf70d1c5da596cea00bb090347d2bd1745 not found: ID does not exist" Mar 19 12:10:36.172922 master-0 kubenswrapper[17644]: I0319 12:10:36.172854 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 12:10:36.495254 master-0 kubenswrapper[17644]: I0319 12:10:36.495196 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a4744531cb137d7252790be662d8cc8" path="/var/lib/kubelet/pods/7a4744531cb137d7252790be662d8cc8/volumes" Mar 19 12:10:37.303890 master-0 kubenswrapper[17644]: I0319 12:10:37.303842 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 12:10:37.889386 master-0 kubenswrapper[17644]: I0319 12:10:37.889337 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 12:10:38.238350 master-0 kubenswrapper[17644]: I0319 12:10:38.238310 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 12:10:39.272757 master-0 kubenswrapper[17644]: I0319 12:10:39.272687 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 12:10:39.513526 master-0 kubenswrapper[17644]: I0319 12:10:39.513475 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 12:10:39.544627 master-0 kubenswrapper[17644]: I0319 12:10:39.544523 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 
12:10:39.664204 master-0 kubenswrapper[17644]: I0319 12:10:39.664160 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 12:10:39.832348 master-0 kubenswrapper[17644]: I0319 12:10:39.832232 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 19 12:10:40.117778 master-0 kubenswrapper[17644]: I0319 12:10:40.117631 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 12:10:40.384147 master-0 kubenswrapper[17644]: I0319 12:10:40.384040 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 12:10:40.507199 master-0 kubenswrapper[17644]: I0319 12:10:40.507152 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-rb6b2" Mar 19 12:10:40.722874 master-0 kubenswrapper[17644]: I0319 12:10:40.722826 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 12:10:40.738700 master-0 kubenswrapper[17644]: I0319 12:10:40.738658 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 12:10:40.762817 master-0 kubenswrapper[17644]: I0319 12:10:40.762766 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 12:10:40.858843 master-0 kubenswrapper[17644]: I0319 12:10:40.858789 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 19 12:10:41.062861 master-0 kubenswrapper[17644]: I0319 12:10:41.062721 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-dnwcp" Mar 19 
12:10:41.438332 master-0 kubenswrapper[17644]: I0319 12:10:41.438293 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 12:10:41.613864 master-0 kubenswrapper[17644]: I0319 12:10:41.613808 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 12:10:41.618466 master-0 kubenswrapper[17644]: I0319 12:10:41.618432 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 12:10:41.708034 master-0 kubenswrapper[17644]: I0319 12:10:41.707910 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 12:10:41.944566 master-0 kubenswrapper[17644]: I0319 12:10:41.944508 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 12:10:42.130034 master-0 kubenswrapper[17644]: I0319 12:10:42.129908 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 12:10:42.840952 master-0 kubenswrapper[17644]: I0319 12:10:42.840867 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 12:10:42.976178 master-0 kubenswrapper[17644]: I0319 12:10:42.976131 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 12:10:43.104803 master-0 kubenswrapper[17644]: I0319 12:10:43.104669 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 19 12:10:43.314805 master-0 kubenswrapper[17644]: I0319 12:10:43.314750 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 19 12:10:44.133088 master-0 
kubenswrapper[17644]: I0319 12:10:44.132953 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 12:10:46.527387 master-0 kubenswrapper[17644]: I0319 12:10:46.527319 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 12:11:10.675478 master-0 kubenswrapper[17644]: I0319 12:11:10.675423 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-7-master-0"] Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: E0319 12:11:10.675699 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4744531cb137d7252790be662d8cc8" containerName="startup-monitor" Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: I0319 12:11:10.675711 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4744531cb137d7252790be662d8cc8" containerName="startup-monitor" Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: E0319 12:11:10.675750 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" containerName="installer" Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: I0319 12:11:10.675757 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" containerName="installer" Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: I0319 12:11:10.675898 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="7def3099-f487-44d4-a1d5-2ae096ef8804" containerName="installer" Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: I0319 12:11:10.675913 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4744531cb137d7252790be662d8cc8" containerName="startup-monitor" Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: I0319 12:11:10.676325 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: I0319 12:11:10.678038 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-kpv7f" Mar 19 12:11:10.678432 master-0 kubenswrapper[17644]: I0319 12:11:10.678204 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 12:11:10.686808 master-0 kubenswrapper[17644]: I0319 12:11:10.686542 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-7-master-0"] Mar 19 12:11:10.807179 master-0 kubenswrapper[17644]: I0319 12:11:10.807122 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\") " pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:10.807363 master-0 kubenswrapper[17644]: I0319 12:11:10.807221 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\") " pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:10.908153 master-0 kubenswrapper[17644]: I0319 12:11:10.908116 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\") " pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:10.908423 master-0 
kubenswrapper[17644]: I0319 12:11:10.908407 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\") " pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:10.908535 master-0 kubenswrapper[17644]: I0319 12:11:10.908465 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\") " pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:10.925798 master-0 kubenswrapper[17644]: I0319 12:11:10.925065 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\") " pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:11.000664 master-0 kubenswrapper[17644]: I0319 12:11:11.000617 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:11.307033 master-0 kubenswrapper[17644]: I0319 12:11:11.306960 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 12:11:11.313170 master-0 kubenswrapper[17644]: I0319 12:11:11.313103 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 12:11:11.480982 master-0 kubenswrapper[17644]: I0319 12:11:11.480715 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-7-master-0"] Mar 19 12:11:11.481094 master-0 kubenswrapper[17644]: W0319 12:11:11.481033 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6daa7b3f_6abd_410f_a040_dcf6bf5521c7.slice/crio-4afbf1f94d516fbb07558d70fa14be873858b5f63c9c52cb79bf8d89015ac3dd WatchSource:0}: Error finding container 4afbf1f94d516fbb07558d70fa14be873858b5f63c9c52cb79bf8d89015ac3dd: Status 404 returned error can't find the container with id 4afbf1f94d516fbb07558d70fa14be873858b5f63c9c52cb79bf8d89015ac3dd Mar 19 12:11:11.507762 master-0 kubenswrapper[17644]: I0319 12:11:11.507704 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:11:11.909610 master-0 kubenswrapper[17644]: I0319 12:11:11.909568 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-7-master-0" event={"ID":"6daa7b3f-6abd-410f-a040-dcf6bf5521c7","Type":"ContainerStarted","Data":"2440154ee7bd3fc8631a64a2159cade11137659870d49c96309fa83d6055f70a"} Mar 19 12:11:11.909610 master-0 kubenswrapper[17644]: I0319 12:11:11.909610 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-7-master-0" 
event={"ID":"6daa7b3f-6abd-410f-a040-dcf6bf5521c7","Type":"ContainerStarted","Data":"4afbf1f94d516fbb07558d70fa14be873858b5f63c9c52cb79bf8d89015ac3dd"} Mar 19 12:11:11.927961 master-0 kubenswrapper[17644]: I0319 12:11:11.927881 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-7-master-0" podStartSLOduration=1.927863695 podStartE2EDuration="1.927863695s" podCreationTimestamp="2026-03-19 12:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:11:11.926687497 +0000 UTC m=+705.696645552" watchObservedRunningTime="2026-03-19 12:11:11.927863695 +0000 UTC m=+705.697821740" Mar 19 12:11:12.493032 master-0 kubenswrapper[17644]: I0319 12:11:12.492951 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e508a43-99db-49eb-bf4e-e3e6a0f49761" path="/var/lib/kubelet/pods/8e508a43-99db-49eb-bf4e-e3e6a0f49761/volumes" Mar 19 12:11:12.917099 master-0 kubenswrapper[17644]: I0319 12:11:12.917048 17644 generic.go:334] "Generic (PLEG): container finished" podID="6daa7b3f-6abd-410f-a040-dcf6bf5521c7" containerID="2440154ee7bd3fc8631a64a2159cade11137659870d49c96309fa83d6055f70a" exitCode=0 Mar 19 12:11:12.917099 master-0 kubenswrapper[17644]: I0319 12:11:12.917095 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-7-master-0" event={"ID":"6daa7b3f-6abd-410f-a040-dcf6bf5521c7","Type":"ContainerDied","Data":"2440154ee7bd3fc8631a64a2159cade11137659870d49c96309fa83d6055f70a"} Mar 19 12:11:14.292460 master-0 kubenswrapper[17644]: I0319 12:11:14.292415 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:14.465670 master-0 kubenswrapper[17644]: I0319 12:11:14.465551 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kubelet-dir\") pod \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\" (UID: \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\") " Mar 19 12:11:14.465670 master-0 kubenswrapper[17644]: I0319 12:11:14.465655 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kube-api-access\") pod \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\" (UID: \"6daa7b3f-6abd-410f-a040-dcf6bf5521c7\") " Mar 19 12:11:14.466173 master-0 kubenswrapper[17644]: I0319 12:11:14.465748 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6daa7b3f-6abd-410f-a040-dcf6bf5521c7" (UID: "6daa7b3f-6abd-410f-a040-dcf6bf5521c7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:11:14.466173 master-0 kubenswrapper[17644]: I0319 12:11:14.466075 17644 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:14.476104 master-0 kubenswrapper[17644]: I0319 12:11:14.475985 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6daa7b3f-6abd-410f-a040-dcf6bf5521c7" (UID: "6daa7b3f-6abd-410f-a040-dcf6bf5521c7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:11:14.569086 master-0 kubenswrapper[17644]: I0319 12:11:14.569027 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6daa7b3f-6abd-410f-a040-dcf6bf5521c7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:14.933955 master-0 kubenswrapper[17644]: I0319 12:11:14.932695 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-7-master-0" event={"ID":"6daa7b3f-6abd-410f-a040-dcf6bf5521c7","Type":"ContainerDied","Data":"4afbf1f94d516fbb07558d70fa14be873858b5f63c9c52cb79bf8d89015ac3dd"} Mar 19 12:11:14.933955 master-0 kubenswrapper[17644]: I0319 12:11:14.932782 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4afbf1f94d516fbb07558d70fa14be873858b5f63c9c52cb79bf8d89015ac3dd" Mar 19 12:11:14.933955 master-0 kubenswrapper[17644]: I0319 12:11:14.932797 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-7-master-0" Mar 19 12:11:27.014664 master-0 kubenswrapper[17644]: I0319 12:11:27.014534 17644 scope.go:117] "RemoveContainer" containerID="300261e39c3fe1898b1aa4629252d5e05f336f7f74bdf1250eea81121a460d42" Mar 19 12:11:27.031297 master-0 kubenswrapper[17644]: I0319 12:11:27.031265 17644 scope.go:117] "RemoveContainer" containerID="6df3457295116a2e9643f9aa93c1bc33230ddf9f1366aab4d64dcdaedbded1b4" Mar 19 12:11:27.046632 master-0 kubenswrapper[17644]: I0319 12:11:27.046593 17644 scope.go:117] "RemoveContainer" containerID="1700c51a0be3b8389e42a5cf379351f4fa21a1a23cc74be2e934a716c3897cd0" Mar 19 12:12:27.101758 master-0 kubenswrapper[17644]: I0319 12:12:27.101667 17644 scope.go:117] "RemoveContainer" containerID="b444dcaee3b4ca7a60c29c3343ca436c90a224a2cac7695b9a98404124c21d5b" Mar 19 12:12:51.063318 master-0 kubenswrapper[17644]: I0319 12:12:51.063237 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc"] Mar 19 12:12:51.064017 master-0 kubenswrapper[17644]: E0319 12:12:51.063588 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6daa7b3f-6abd-410f-a040-dcf6bf5521c7" containerName="pruner" Mar 19 12:12:51.064017 master-0 kubenswrapper[17644]: I0319 12:12:51.063604 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="6daa7b3f-6abd-410f-a040-dcf6bf5521c7" containerName="pruner" Mar 19 12:12:51.064017 master-0 kubenswrapper[17644]: I0319 12:12:51.063754 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="6daa7b3f-6abd-410f-a040-dcf6bf5521c7" containerName="pruner" Mar 19 12:12:51.064272 master-0 kubenswrapper[17644]: I0319 12:12:51.064239 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.066444 master-0 kubenswrapper[17644]: I0319 12:12:51.066410 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 12:12:51.066522 master-0 kubenswrapper[17644]: I0319 12:12:51.066464 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 12:12:51.070102 master-0 kubenswrapper[17644]: I0319 12:12:51.070062 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:12:51.070587 master-0 kubenswrapper[17644]: I0319 12:12:51.070555 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 12:12:51.070895 master-0 kubenswrapper[17644]: I0319 12:12:51.070857 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 12:12:51.071568 master-0 kubenswrapper[17644]: I0319 12:12:51.070859 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 12:12:51.071568 master-0 kubenswrapper[17644]: I0319 12:12:51.071133 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 12:12:51.079014 master-0 kubenswrapper[17644]: I0319 12:12:51.078971 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-sjgm7" Mar 19 12:12:51.079481 master-0 kubenswrapper[17644]: I0319 12:12:51.079331 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.080130 master-0 kubenswrapper[17644]: I0319 12:12:51.080097 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 12:12:51.080247 master-0 kubenswrapper[17644]: I0319 12:12:51.080212 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 12:12:51.080291 master-0 kubenswrapper[17644]: I0319 12:12:51.080269 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 12:12:51.080323 master-0 kubenswrapper[17644]: I0319 12:12:51.080233 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 12:12:51.080354 master-0 kubenswrapper[17644]: I0319 12:12:51.080343 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 12:12:51.085418 master-0 kubenswrapper[17644]: I0319 12:12:51.085342 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:12:51.088955 master-0 kubenswrapper[17644]: I0319 12:12:51.088928 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 12:12:51.089037 master-0 kubenswrapper[17644]: I0319 12:12:51.088998 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.089780 master-0 kubenswrapper[17644]: I0319 12:12:51.089084 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:12:51.089780 master-0 kubenswrapper[17644]: I0319 12:12:51.089405 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 12:12:51.089780 master-0 kubenswrapper[17644]: I0319 12:12:51.089479 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 12:12:51.089780 master-0 kubenswrapper[17644]: I0319 12:12:51.089570 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 12:12:51.089780 master-0 kubenswrapper[17644]: I0319 12:12:51.089642 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 12:12:51.089953 master-0 kubenswrapper[17644]: I0319 12:12:51.089919 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 12:12:51.090894 master-0 kubenswrapper[17644]: I0319 12:12:51.090872 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 19 12:12:51.093556 master-0 kubenswrapper[17644]: I0319 12:12:51.093531 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 19 12:12:51.093805 master-0 kubenswrapper[17644]: I0319 12:12:51.093741 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 19 12:12:51.093967 master-0 kubenswrapper[17644]: I0319 12:12:51.093942 17644 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 19 12:12:51.094229 master-0 kubenswrapper[17644]: I0319 12:12:51.094201 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 12:12:51.094496 master-0 kubenswrapper[17644]: I0319 12:12:51.094470 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 12:12:51.094789 master-0 kubenswrapper[17644]: I0319 12:12:51.094769 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 19 12:12:51.095060 master-0 kubenswrapper[17644]: I0319 12:12:51.095035 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 12:12:51.095103 master-0 kubenswrapper[17644]: I0319 12:12:51.095093 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 19 12:12:51.095481 master-0 kubenswrapper[17644]: I0319 12:12:51.095453 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 19 12:12:51.095635 master-0 kubenswrapper[17644]: I0319 12:12:51.095610 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 19 12:12:51.095759 master-0 kubenswrapper[17644]: I0319 12:12:51.095730 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 12:12:51.096271 master-0 kubenswrapper[17644]: I0319 12:12:51.096253 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 19 12:12:51.096418 master-0 kubenswrapper[17644]: I0319 12:12:51.096387 17644 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-aprstf5fs6eqr" Mar 19 12:12:51.103455 master-0 kubenswrapper[17644]: I0319 12:12:51.103418 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 19 12:12:51.104988 master-0 kubenswrapper[17644]: I0319 12:12:51.104962 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 19 12:12:51.147072 master-0 kubenswrapper[17644]: I0319 12:12:51.146999 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc"] Mar 19 12:12:51.153393 master-0 kubenswrapper[17644]: I0319 12:12:51.153325 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:12:51.172775 master-0 kubenswrapper[17644]: I0319 12:12:51.172711 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.172917 master-0 kubenswrapper[17644]: I0319 12:12:51.172827 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.172971 master-0 kubenswrapper[17644]: I0319 12:12:51.172933 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt9nb\" (UniqueName: 
\"kubernetes.io/projected/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-kube-api-access-rt9nb\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.173013 master-0 kubenswrapper[17644]: I0319 12:12:51.172975 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2zkz\" (UniqueName: \"kubernetes.io/projected/282960d1-08a2-4187-8279-2081bfdda059-kube-api-access-q2zkz\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.173053 master-0 kubenswrapper[17644]: I0319 12:12:51.173010 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-config\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.173053 master-0 kubenswrapper[17644]: I0319 12:12:51.173029 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-service-ca\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.173162 master-0 kubenswrapper[17644]: I0319 12:12:51.173056 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.173162 master-0 kubenswrapper[17644]: I0319 12:12:51.173088 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.173162 master-0 kubenswrapper[17644]: I0319 12:12:51.173112 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.173162 master-0 kubenswrapper[17644]: I0319 12:12:51.173134 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.173162 master-0 kubenswrapper[17644]: I0319 12:12:51.173156 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.173349 master-0 kubenswrapper[17644]: I0319 12:12:51.173173 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-audit-policies\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.173349 master-0 kubenswrapper[17644]: I0319 12:12:51.173193 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-error\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.173510 master-0 kubenswrapper[17644]: I0319 12:12:51.173469 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.173573 master-0 kubenswrapper[17644]: I0319 12:12:51.173518 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce7ed383-661e-4825-8eb0-ea529a90acc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.173573 master-0 kubenswrapper[17644]: I0319 12:12:51.173556 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.173652 master-0 kubenswrapper[17644]: I0319 12:12:51.173574 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.173652 master-0 kubenswrapper[17644]: I0319 12:12:51.173592 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.173652 master-0 kubenswrapper[17644]: I0319 12:12:51.173613 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-config-out\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.173971 master-0 kubenswrapper[17644]: I0319 12:12:51.173909 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-web-config\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.174046 master-0 kubenswrapper[17644]: I0319 12:12:51.174009 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.174112 master-0 kubenswrapper[17644]: I0319 12:12:51.174078 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.174160 master-0 kubenswrapper[17644]: I0319 12:12:51.174143 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/282960d1-08a2-4187-8279-2081bfdda059-audit-dir\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.174235 master-0 kubenswrapper[17644]: I0319 12:12:51.174197 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.174322 master-0 kubenswrapper[17644]: I0319 12:12:51.174289 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.174377 master-0 
kubenswrapper[17644]: I0319 12:12:51.174338 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.174484 master-0 kubenswrapper[17644]: I0319 12:12:51.174447 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.174547 master-0 kubenswrapper[17644]: I0319 12:12:51.174506 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.174584 master-0 kubenswrapper[17644]: I0319 12:12:51.174554 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqtsc\" (UniqueName: \"kubernetes.io/projected/ce7ed383-661e-4825-8eb0-ea529a90acc2-kube-api-access-vqtsc\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.174628 master-0 kubenswrapper[17644]: I0319 12:12:51.174593 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.174704 master-0 kubenswrapper[17644]: I0319 12:12:51.174665 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.174952 master-0 kubenswrapper[17644]: I0319 12:12:51.174911 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.175015 master-0 kubenswrapper[17644]: I0319 12:12:51.174961 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.175051 master-0 kubenswrapper[17644]: I0319 12:12:51.175031 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-login\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " 
pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.175098 master-0 kubenswrapper[17644]: I0319 12:12:51.175069 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.175138 master-0 kubenswrapper[17644]: I0319 12:12:51.175107 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.175169 master-0 kubenswrapper[17644]: I0319 12:12:51.175136 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-session\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.175169 master-0 kubenswrapper[17644]: I0319 12:12:51.175160 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-router-certs\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.175256 master-0 kubenswrapper[17644]: I0319 12:12:51.175197 17644 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.175293 master-0 kubenswrapper[17644]: I0319 12:12:51.175252 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.175347 master-0 kubenswrapper[17644]: I0319 12:12:51.175293 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce7ed383-661e-4825-8eb0-ea529a90acc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.175383 master-0 kubenswrapper[17644]: I0319 12:12:51.175369 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.175446 master-0 kubenswrapper[17644]: I0319 12:12:51.175413 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.277773 master-0 kubenswrapper[17644]: I0319 12:12:51.277649 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-error\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.277773 master-0 kubenswrapper[17644]: I0319 12:12:51.277741 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.278092 master-0 kubenswrapper[17644]: I0319 12:12:51.278067 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce7ed383-661e-4825-8eb0-ea529a90acc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.278229 master-0 kubenswrapper[17644]: I0319 12:12:51.278206 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.278346 master-0 kubenswrapper[17644]: I0319 12:12:51.278327 17644 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.278449 master-0 kubenswrapper[17644]: I0319 12:12:51.278432 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.278554 master-0 kubenswrapper[17644]: I0319 12:12:51.278540 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-config-out\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.278647 master-0 kubenswrapper[17644]: I0319 12:12:51.278635 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-web-config\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.278900 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 
kubenswrapper[17644]: I0319 12:12:51.278954 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.278990 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/282960d1-08a2-4187-8279-2081bfdda059-audit-dir\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279019 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279048 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279075 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279105 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279137 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279165 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqtsc\" (UniqueName: \"kubernetes.io/projected/ce7ed383-661e-4825-8eb0-ea529a90acc2-kube-api-access-vqtsc\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279201 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279239 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279264 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279281 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279292 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279358 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-login\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: 
I0319 12:12:51.279383 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279403 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279402 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279426 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-router-certs\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279447 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-session\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: 
\"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279474 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279512 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279539 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce7ed383-661e-4825-8eb0-ea529a90acc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279577 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279605 17644 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279627 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279657 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279690 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt9nb\" (UniqueName: \"kubernetes.io/projected/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-kube-api-access-rt9nb\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279707 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2zkz\" (UniqueName: \"kubernetes.io/projected/282960d1-08a2-4187-8279-2081bfdda059-kube-api-access-q2zkz\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 
master-0 kubenswrapper[17644]: I0319 12:12:51.279748 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-config\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279764 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-service-ca\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279790 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.279782 master-0 kubenswrapper[17644]: I0319 12:12:51.279814 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.281208 master-0 kubenswrapper[17644]: I0319 12:12:51.279831 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.281208 master-0 kubenswrapper[17644]: I0319 12:12:51.279851 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.281208 master-0 kubenswrapper[17644]: I0319 12:12:51.279869 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.281208 master-0 kubenswrapper[17644]: I0319 12:12:51.279885 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-audit-policies\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.281208 master-0 kubenswrapper[17644]: I0319 12:12:51.280058 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.281208 master-0 kubenswrapper[17644]: I0319 12:12:51.280195 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.281208 master-0 kubenswrapper[17644]: I0319 12:12:51.280525 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-audit-policies\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.281208 master-0 kubenswrapper[17644]: I0319 12:12:51.281062 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce7ed383-661e-4825-8eb0-ea529a90acc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.281541 master-0 kubenswrapper[17644]: I0319 12:12:51.281496 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.283872 master-0 kubenswrapper[17644]: I0319 12:12:51.283113 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.283872 master-0 kubenswrapper[17644]: I0319 12:12:51.283243 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-config-out\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.283872 master-0 kubenswrapper[17644]: I0319 12:12:51.283704 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-service-ca\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.283872 master-0 kubenswrapper[17644]: I0319 12:12:51.283797 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-error\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.284516 master-0 kubenswrapper[17644]: I0319 12:12:51.284086 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.285254 master-0 kubenswrapper[17644]: I0319 12:12:51.285226 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.285799 master-0 kubenswrapper[17644]: I0319 12:12:51.285766 
17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.285918 master-0 kubenswrapper[17644]: I0319 12:12:51.285888 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-web-config\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.286501 master-0 kubenswrapper[17644]: I0319 12:12:51.286474 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.286584 master-0 kubenswrapper[17644]: I0319 12:12:51.286549 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.287069 master-0 kubenswrapper[17644]: I0319 12:12:51.287037 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 
12:12:51.287411 master-0 kubenswrapper[17644]: I0319 12:12:51.287361 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-config\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.287532 master-0 kubenswrapper[17644]: I0319 12:12:51.287503 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.288139 master-0 kubenswrapper[17644]: I0319 12:12:51.288104 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.289676 master-0 kubenswrapper[17644]: I0319 12:12:51.289642 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.290170 master-0 kubenswrapper[17644]: I0319 12:12:51.290143 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.290242 master-0 kubenswrapper[17644]: I0319 12:12:51.290188 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.290242 master-0 kubenswrapper[17644]: I0319 12:12:51.290238 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/282960d1-08a2-4187-8279-2081bfdda059-audit-dir\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.291710 master-0 kubenswrapper[17644]: I0319 12:12:51.291142 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.291710 master-0 kubenswrapper[17644]: I0319 12:12:51.291513 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.292315 master-0 kubenswrapper[17644]: I0319 12:12:51.292290 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.293467 master-0 kubenswrapper[17644]: I0319 12:12:51.293435 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.293766 master-0 kubenswrapper[17644]: I0319 12:12:51.293586 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce7ed383-661e-4825-8eb0-ea529a90acc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.294389 master-0 kubenswrapper[17644]: I0319 12:12:51.294346 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-session\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.295012 master-0 kubenswrapper[17644]: I0319 12:12:51.294908 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.295796 master-0 kubenswrapper[17644]: I0319 12:12:51.295420 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" 
(UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.295796 master-0 kubenswrapper[17644]: I0319 12:12:51.295645 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-user-template-login\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.295796 master-0 kubenswrapper[17644]: I0319 12:12:51.295752 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.296573 master-0 kubenswrapper[17644]: I0319 12:12:51.296541 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.297632 master-0 kubenswrapper[17644]: I0319 12:12:51.297534 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/282960d1-08a2-4187-8279-2081bfdda059-v4-0-config-system-router-certs\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " 
pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.298360 master-0 kubenswrapper[17644]: I0319 12:12:51.298328 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt9nb\" (UniqueName: \"kubernetes.io/projected/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-kube-api-access-rt9nb\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.298422 master-0 kubenswrapper[17644]: I0319 12:12:51.298384 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ce7ed383-661e-4825-8eb0-ea529a90acc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.298461 master-0 kubenswrapper[17644]: I0319 12:12:51.298441 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2zkz\" (UniqueName: \"kubernetes.io/projected/282960d1-08a2-4187-8279-2081bfdda059-kube-api-access-q2zkz\") pod \"oauth-openshift-65dbcfd7b7-qq8lc\" (UID: \"282960d1-08a2-4187-8279-2081bfdda059\") " pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.298992 master-0 kubenswrapper[17644]: I0319 12:12:51.298964 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.300003 master-0 kubenswrapper[17644]: I0319 12:12:51.299904 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce7ed383-661e-4825-8eb0-ea529a90acc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" 
(UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.304875 master-0 kubenswrapper[17644]: I0319 12:12:51.304846 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqtsc\" (UniqueName: \"kubernetes.io/projected/ce7ed383-661e-4825-8eb0-ea529a90acc2-kube-api-access-vqtsc\") pod \"prometheus-k8s-0\" (UID: \"ce7ed383-661e-4825-8eb0-ea529a90acc2\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.443504 master-0 kubenswrapper[17644]: I0319 12:12:51.443427 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:51.468998 master-0 kubenswrapper[17644]: I0319 12:12:51.468913 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:12:51.490303 master-0 kubenswrapper[17644]: I0319 12:12:51.490202 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:12:51.873894 master-0 kubenswrapper[17644]: I0319 12:12:51.873863 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc"] Mar 19 12:12:51.949769 master-0 kubenswrapper[17644]: I0319 12:12:51.949236 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:12:51.955173 master-0 kubenswrapper[17644]: W0319 12:12:51.955092 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded7c7fd0_1772_43a4_b4b6_84dfe358f5b9.slice/crio-9f9f09a263972070c78ee004280538821c5b3b431f56384cf586122f5c9cd2e1 WatchSource:0}: Error finding container 9f9f09a263972070c78ee004280538821c5b3b431f56384cf586122f5c9cd2e1: Status 404 returned error can't find the container with id 9f9f09a263972070c78ee004280538821c5b3b431f56384cf586122f5c9cd2e1 Mar 19 12:12:51.955559 master-0 kubenswrapper[17644]: I0319 12:12:51.955518 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:12:51.955908 master-0 kubenswrapper[17644]: W0319 12:12:51.955880 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce7ed383_661e_4825_8eb0_ea529a90acc2.slice/crio-d19b5e77a35c55c3ff464504ccd7dab53ca5d92fa8207b25e85be3ee9745eaff WatchSource:0}: Error finding container d19b5e77a35c55c3ff464504ccd7dab53ca5d92fa8207b25e85be3ee9745eaff: Status 404 returned error can't find the container with id d19b5e77a35c55c3ff464504ccd7dab53ca5d92fa8207b25e85be3ee9745eaff Mar 19 12:12:52.639096 master-0 kubenswrapper[17644]: I0319 12:12:52.639035 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" 
event={"ID":"282960d1-08a2-4187-8279-2081bfdda059","Type":"ContainerStarted","Data":"6be2e4960edd97ed17f3e0907cc21d5b15d6470444d39c7eed453603e47bda1d"} Mar 19 12:12:52.639096 master-0 kubenswrapper[17644]: I0319 12:12:52.639110 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" event={"ID":"282960d1-08a2-4187-8279-2081bfdda059","Type":"ContainerStarted","Data":"fdd35ad69eecccc716e4e52b161252b596385855b81c9c648a6cf81539eb0043"} Mar 19 12:12:52.639647 master-0 kubenswrapper[17644]: I0319 12:12:52.639422 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:52.640691 master-0 kubenswrapper[17644]: I0319 12:12:52.640648 17644 generic.go:334] "Generic (PLEG): container finished" podID="ce7ed383-661e-4825-8eb0-ea529a90acc2" containerID="d74428105e479717bac3b9110357ecf21ce1dda217f2fac028e595ffab989608" exitCode=0 Mar 19 12:12:52.640869 master-0 kubenswrapper[17644]: I0319 12:12:52.640821 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce7ed383-661e-4825-8eb0-ea529a90acc2","Type":"ContainerDied","Data":"d74428105e479717bac3b9110357ecf21ce1dda217f2fac028e595ffab989608"} Mar 19 12:12:52.641010 master-0 kubenswrapper[17644]: I0319 12:12:52.640992 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce7ed383-661e-4825-8eb0-ea529a90acc2","Type":"ContainerStarted","Data":"d19b5e77a35c55c3ff464504ccd7dab53ca5d92fa8207b25e85be3ee9745eaff"} Mar 19 12:12:52.642583 master-0 kubenswrapper[17644]: I0319 12:12:52.642558 17644 generic.go:334] "Generic (PLEG): container finished" podID="ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9" containerID="126d60aeb65d9f246cc77ca70c18bfe7da3004f2f5d334f168feadff256efff2" exitCode=0 Mar 19 12:12:52.642645 master-0 kubenswrapper[17644]: I0319 12:12:52.642616 17644 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9","Type":"ContainerDied","Data":"126d60aeb65d9f246cc77ca70c18bfe7da3004f2f5d334f168feadff256efff2"} Mar 19 12:12:52.642645 master-0 kubenswrapper[17644]: I0319 12:12:52.642640 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9","Type":"ContainerStarted","Data":"9f9f09a263972070c78ee004280538821c5b3b431f56384cf586122f5c9cd2e1"} Mar 19 12:12:52.644486 master-0 kubenswrapper[17644]: I0319 12:12:52.644461 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" Mar 19 12:12:52.672403 master-0 kubenswrapper[17644]: I0319 12:12:52.672308 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-65dbcfd7b7-qq8lc" podStartSLOduration=214.672279246 podStartE2EDuration="3m34.672279246s" podCreationTimestamp="2026-03-19 12:09:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:12:52.663542676 +0000 UTC m=+806.433500721" watchObservedRunningTime="2026-03-19 12:12:52.672279246 +0000 UTC m=+806.442237291" Mar 19 12:12:53.661790 master-0 kubenswrapper[17644]: I0319 12:12:53.661283 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9","Type":"ContainerStarted","Data":"85ab08efc619176fabcd35ed9e4a98439a298ebe64fc3d09f4ccbedfc84c33c7"} Mar 19 12:12:53.661790 master-0 kubenswrapper[17644]: I0319 12:12:53.661342 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9","Type":"ContainerStarted","Data":"7c5ed27b18413d0b4bc9aa15408d05191ce3398ecd84fe83a8a4746ef48f8836"} Mar 19 12:12:53.661790 master-0 kubenswrapper[17644]: I0319 12:12:53.661354 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9","Type":"ContainerStarted","Data":"8c3aa0d87b9e17daf4ab368b08d56d29acb91a60d147eab8139c924ecf33eb62"} Mar 19 12:12:53.661790 master-0 kubenswrapper[17644]: I0319 12:12:53.661363 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9","Type":"ContainerStarted","Data":"801d0911bee292f9cd350cf8d846d4ded947789b17ed889f08c04d5bc356abf5"} Mar 19 12:12:53.667388 master-0 kubenswrapper[17644]: I0319 12:12:53.667312 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce7ed383-661e-4825-8eb0-ea529a90acc2","Type":"ContainerStarted","Data":"47fde9aa4f53c418296650821afc5d094f5c424d8923172760395b662ba4df39"} Mar 19 12:12:53.667388 master-0 kubenswrapper[17644]: I0319 12:12:53.667391 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce7ed383-661e-4825-8eb0-ea529a90acc2","Type":"ContainerStarted","Data":"077ed7a30ddf8679ec0e0441649889286d38b63a13ff57d0f3edc147831fa1ee"} Mar 19 12:12:53.667630 master-0 kubenswrapper[17644]: I0319 12:12:53.667406 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce7ed383-661e-4825-8eb0-ea529a90acc2","Type":"ContainerStarted","Data":"3e7755195a2d2767b89b43cbf5cec67019885a5fd666e21392f1d86acd8283f7"} Mar 19 12:12:53.667630 master-0 kubenswrapper[17644]: I0319 12:12:53.667449 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ce7ed383-661e-4825-8eb0-ea529a90acc2","Type":"ContainerStarted","Data":"1de0672e6d5fed1bfb5a3c0535a8080207d23b87516bbc730d56ea8fd7f399c5"} Mar 19 12:12:54.678309 master-0 kubenswrapper[17644]: I0319 12:12:54.678263 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce7ed383-661e-4825-8eb0-ea529a90acc2","Type":"ContainerStarted","Data":"8da1511cf74cc5672b1f72f4f0e4524d2b90c450c73192339c62e79128956116"} Mar 19 12:12:54.678309 master-0 kubenswrapper[17644]: I0319 12:12:54.678305 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce7ed383-661e-4825-8eb0-ea529a90acc2","Type":"ContainerStarted","Data":"d123c662bd9cdd57ddf7cb06fd13ef8d940ff365e16b137ab4e3b6571e6c4a21"} Mar 19 12:12:54.690557 master-0 kubenswrapper[17644]: I0319 12:12:54.690511 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9","Type":"ContainerStarted","Data":"8143e94da6da2b7388ed954751b0573bd22fb2542f79fe726d8432104c0fd1cc"} Mar 19 12:12:54.690557 master-0 kubenswrapper[17644]: I0319 12:12:54.690553 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9","Type":"ContainerStarted","Data":"15d53e1bd7ca899c52d7cafb6ff643180960491bb74c843974069aebd323449d"} Mar 19 12:12:54.716075 master-0 kubenswrapper[17644]: I0319 12:12:54.715976 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=139.715956095 podStartE2EDuration="2m19.715956095s" podCreationTimestamp="2026-03-19 12:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:12:54.711749883 +0000 UTC m=+808.481707928" 
watchObservedRunningTime="2026-03-19 12:12:54.715956095 +0000 UTC m=+808.485914150" Mar 19 12:12:54.746486 master-0 kubenswrapper[17644]: I0319 12:12:54.746400 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=185.746378389 podStartE2EDuration="3m5.746378389s" podCreationTimestamp="2026-03-19 12:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:12:54.742091256 +0000 UTC m=+808.512049321" watchObservedRunningTime="2026-03-19 12:12:54.746378389 +0000 UTC m=+808.516336444" Mar 19 12:12:56.495541 master-0 kubenswrapper[17644]: I0319 12:12:56.495449 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:51.491490 master-0 kubenswrapper[17644]: I0319 12:13:51.491421 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:51.521843 master-0 kubenswrapper[17644]: I0319 12:13:51.521795 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:52.243000 master-0 kubenswrapper[17644]: I0319 12:13:52.242936 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:14:10.000940 master-0 kubenswrapper[17644]: I0319 12:14:10.000872 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-79d675c7f7-j6qn8"] Mar 19 12:14:10.006114 master-0 kubenswrapper[17644]: I0319 12:14:10.006055 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.023035 master-0 kubenswrapper[17644]: I0319 12:14:10.022982 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79d675c7f7-j6qn8"] Mar 19 12:14:10.164772 master-0 kubenswrapper[17644]: I0319 12:14:10.164657 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-service-ca\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.165064 master-0 kubenswrapper[17644]: I0319 12:14:10.164818 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-oauth-serving-cert\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.165064 master-0 kubenswrapper[17644]: I0319 12:14:10.164857 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-serving-cert\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.165064 master-0 kubenswrapper[17644]: I0319 12:14:10.164902 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-oauth-config\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.165064 master-0 
kubenswrapper[17644]: I0319 12:14:10.164937 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-config\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.165248 master-0 kubenswrapper[17644]: I0319 12:14:10.165139 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-trusted-ca-bundle\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.165392 master-0 kubenswrapper[17644]: I0319 12:14:10.165356 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzc7\" (UniqueName: \"kubernetes.io/projected/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-kube-api-access-8vzc7\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.267885 master-0 kubenswrapper[17644]: I0319 12:14:10.267345 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-oauth-config\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.267885 master-0 kubenswrapper[17644]: I0319 12:14:10.267462 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-config\") pod \"console-79d675c7f7-j6qn8\" (UID: 
\"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.267885 master-0 kubenswrapper[17644]: I0319 12:14:10.267505 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-trusted-ca-bundle\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.267885 master-0 kubenswrapper[17644]: I0319 12:14:10.267547 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzc7\" (UniqueName: \"kubernetes.io/projected/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-kube-api-access-8vzc7\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.267885 master-0 kubenswrapper[17644]: I0319 12:14:10.267607 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-service-ca\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.267885 master-0 kubenswrapper[17644]: I0319 12:14:10.267681 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-oauth-serving-cert\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.267885 master-0 kubenswrapper[17644]: I0319 12:14:10.267713 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-serving-cert\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.269943 master-0 kubenswrapper[17644]: I0319 12:14:10.269886 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-config\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.270105 master-0 kubenswrapper[17644]: I0319 12:14:10.270062 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-service-ca\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.270674 master-0 kubenswrapper[17644]: I0319 12:14:10.270612 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-trusted-ca-bundle\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.271193 master-0 kubenswrapper[17644]: I0319 12:14:10.271138 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-oauth-serving-cert\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.273212 master-0 kubenswrapper[17644]: I0319 12:14:10.273158 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-oauth-config\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.278598 master-0 kubenswrapper[17644]: I0319 12:14:10.278549 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-serving-cert\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.289221 master-0 kubenswrapper[17644]: I0319 12:14:10.289167 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzc7\" (UniqueName: \"kubernetes.io/projected/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-kube-api-access-8vzc7\") pod \"console-79d675c7f7-j6qn8\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.370233 master-0 kubenswrapper[17644]: I0319 12:14:10.370147 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:10.804084 master-0 kubenswrapper[17644]: I0319 12:14:10.803964 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79d675c7f7-j6qn8"] Mar 19 12:14:10.806646 master-0 kubenswrapper[17644]: W0319 12:14:10.806597 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e9110ab_58dc_4cdc_b5f8_5ed8ba5a4523.slice/crio-cd47fc80fc85c8d2e0f19092f5d2957662547284c5257034e2300e8868387e56 WatchSource:0}: Error finding container cd47fc80fc85c8d2e0f19092f5d2957662547284c5257034e2300e8868387e56: Status 404 returned error can't find the container with id cd47fc80fc85c8d2e0f19092f5d2957662547284c5257034e2300e8868387e56 Mar 19 12:14:11.333264 master-0 kubenswrapper[17644]: I0319 12:14:11.333165 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79d675c7f7-j6qn8"] Mar 19 12:14:11.358543 master-0 kubenswrapper[17644]: I0319 12:14:11.358489 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79d675c7f7-j6qn8" event={"ID":"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523","Type":"ContainerStarted","Data":"a4160f23177548bdf9324ff1b1f2bd4e29bbd67a2b0a7a97490160a41439a9a0"} Mar 19 12:14:11.358759 master-0 kubenswrapper[17644]: I0319 12:14:11.358555 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79d675c7f7-j6qn8" event={"ID":"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523","Type":"ContainerStarted","Data":"cd47fc80fc85c8d2e0f19092f5d2957662547284c5257034e2300e8868387e56"} Mar 19 12:14:11.388453 master-0 kubenswrapper[17644]: I0319 12:14:11.388392 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-69c7fd464c-4x4r7"] Mar 19 12:14:11.389678 master-0 kubenswrapper[17644]: I0319 12:14:11.389629 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.395998 master-0 kubenswrapper[17644]: I0319 12:14:11.395169 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt4b7\" (UniqueName: \"kubernetes.io/projected/ba2a45c7-d196-489e-992c-ed8553206ced-kube-api-access-mt4b7\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.395998 master-0 kubenswrapper[17644]: I0319 12:14:11.395225 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-oauth-serving-cert\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.395998 master-0 kubenswrapper[17644]: I0319 12:14:11.395379 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-serving-cert\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.395998 master-0 kubenswrapper[17644]: I0319 12:14:11.395601 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-console-config\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.395998 master-0 kubenswrapper[17644]: I0319 12:14:11.395645 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-service-ca\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.395998 master-0 kubenswrapper[17644]: I0319 12:14:11.395770 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-oauth-config\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.395998 master-0 kubenswrapper[17644]: I0319 12:14:11.395824 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-trusted-ca-bundle\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.396741 master-0 kubenswrapper[17644]: I0319 12:14:11.396662 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79d675c7f7-j6qn8" podStartSLOduration=2.396648923 podStartE2EDuration="2.396648923s" podCreationTimestamp="2026-03-19 12:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:14:11.384992301 +0000 UTC m=+885.154950356" watchObservedRunningTime="2026-03-19 12:14:11.396648923 +0000 UTC m=+885.166606968" Mar 19 12:14:11.406904 master-0 kubenswrapper[17644]: I0319 12:14:11.406227 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69c7fd464c-4x4r7"] Mar 19 12:14:11.500276 master-0 kubenswrapper[17644]: I0319 12:14:11.500212 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-oauth-config\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.500485 master-0 kubenswrapper[17644]: I0319 12:14:11.500321 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-trusted-ca-bundle\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.503296 master-0 kubenswrapper[17644]: I0319 12:14:11.502170 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-trusted-ca-bundle\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.503296 master-0 kubenswrapper[17644]: I0319 12:14:11.502239 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt4b7\" (UniqueName: \"kubernetes.io/projected/ba2a45c7-d196-489e-992c-ed8553206ced-kube-api-access-mt4b7\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.503296 master-0 kubenswrapper[17644]: I0319 12:14:11.502288 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-oauth-serving-cert\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.503296 master-0 kubenswrapper[17644]: I0319 12:14:11.502418 17644 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-serving-cert\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.503296 master-0 kubenswrapper[17644]: I0319 12:14:11.502451 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-console-config\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.506616 master-0 kubenswrapper[17644]: I0319 12:14:11.503646 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-console-config\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.506616 master-0 kubenswrapper[17644]: I0319 12:14:11.503700 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-service-ca\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.506616 master-0 kubenswrapper[17644]: I0319 12:14:11.504577 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-service-ca\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.506616 master-0 kubenswrapper[17644]: I0319 12:14:11.504773 17644 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-oauth-serving-cert\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.506616 master-0 kubenswrapper[17644]: I0319 12:14:11.504823 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-oauth-config\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.510078 master-0 kubenswrapper[17644]: I0319 12:14:11.510038 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-serving-cert\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.518805 master-0 kubenswrapper[17644]: I0319 12:14:11.518145 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt4b7\" (UniqueName: \"kubernetes.io/projected/ba2a45c7-d196-489e-992c-ed8553206ced-kube-api-access-mt4b7\") pod \"console-69c7fd464c-4x4r7\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") " pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:11.712034 master-0 kubenswrapper[17644]: I0319 12:14:11.711935 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:12.135175 master-0 kubenswrapper[17644]: I0319 12:14:12.135042 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69c7fd464c-4x4r7"] Mar 19 12:14:12.143950 master-0 kubenswrapper[17644]: W0319 12:14:12.143776 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba2a45c7_d196_489e_992c_ed8553206ced.slice/crio-654dda2a3d2db2d059c73bd1494c9dce4722f56cdd611e08a15a178bda097a1b WatchSource:0}: Error finding container 654dda2a3d2db2d059c73bd1494c9dce4722f56cdd611e08a15a178bda097a1b: Status 404 returned error can't find the container with id 654dda2a3d2db2d059c73bd1494c9dce4722f56cdd611e08a15a178bda097a1b Mar 19 12:14:12.368295 master-0 kubenswrapper[17644]: I0319 12:14:12.368207 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c7fd464c-4x4r7" event={"ID":"ba2a45c7-d196-489e-992c-ed8553206ced","Type":"ContainerStarted","Data":"cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb"} Mar 19 12:14:12.368295 master-0 kubenswrapper[17644]: I0319 12:14:12.368302 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c7fd464c-4x4r7" event={"ID":"ba2a45c7-d196-489e-992c-ed8553206ced","Type":"ContainerStarted","Data":"654dda2a3d2db2d059c73bd1494c9dce4722f56cdd611e08a15a178bda097a1b"} Mar 19 12:14:12.398179 master-0 kubenswrapper[17644]: I0319 12:14:12.397905 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69c7fd464c-4x4r7" podStartSLOduration=1.397866299 podStartE2EDuration="1.397866299s" podCreationTimestamp="2026-03-19 12:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:14:12.391768102 +0000 UTC m=+886.161726167" 
watchObservedRunningTime="2026-03-19 12:14:12.397866299 +0000 UTC m=+886.167824344" Mar 19 12:14:20.371161 master-0 kubenswrapper[17644]: I0319 12:14:20.370963 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:21.712631 master-0 kubenswrapper[17644]: I0319 12:14:21.712564 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:21.712631 master-0 kubenswrapper[17644]: I0319 12:14:21.712630 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:21.717227 master-0 kubenswrapper[17644]: I0319 12:14:21.717186 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:22.473526 master-0 kubenswrapper[17644]: I0319 12:14:22.473438 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69c7fd464c-4x4r7" Mar 19 12:14:22.579831 master-0 kubenswrapper[17644]: I0319 12:14:22.577821 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7db659c55f-mfdrv"] Mar 19 12:14:27.170052 master-0 kubenswrapper[17644]: I0319 12:14:27.169927 17644 scope.go:117] "RemoveContainer" containerID="a1820a5c08e897a9a826fb2120795cc3a6c64a34860ea5d00dda9abbdf9766f3" Mar 19 12:14:37.411316 master-0 kubenswrapper[17644]: I0319 12:14:37.411241 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-79d675c7f7-j6qn8" podUID="9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" containerName="console" containerID="cri-o://a4160f23177548bdf9324ff1b1f2bd4e29bbd67a2b0a7a97490160a41439a9a0" gracePeriod=15 Mar 19 12:14:37.616995 master-0 kubenswrapper[17644]: I0319 12:14:37.616937 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-79d675c7f7-j6qn8_9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523/console/0.log" Mar 19 12:14:37.617219 master-0 kubenswrapper[17644]: I0319 12:14:37.617008 17644 generic.go:334] "Generic (PLEG): container finished" podID="9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" containerID="a4160f23177548bdf9324ff1b1f2bd4e29bbd67a2b0a7a97490160a41439a9a0" exitCode=2 Mar 19 12:14:37.617219 master-0 kubenswrapper[17644]: I0319 12:14:37.617050 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79d675c7f7-j6qn8" event={"ID":"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523","Type":"ContainerDied","Data":"a4160f23177548bdf9324ff1b1f2bd4e29bbd67a2b0a7a97490160a41439a9a0"} Mar 19 12:14:37.811490 master-0 kubenswrapper[17644]: I0319 12:14:37.811424 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79d675c7f7-j6qn8_9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523/console/0.log" Mar 19 12:14:37.811490 master-0 kubenswrapper[17644]: I0319 12:14:37.811490 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:37.890054 master-0 kubenswrapper[17644]: I0319 12:14:37.889898 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-oauth-config\") pod \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " Mar 19 12:14:37.890054 master-0 kubenswrapper[17644]: I0319 12:14:37.890011 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-service-ca\") pod \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " Mar 19 12:14:37.890630 master-0 kubenswrapper[17644]: I0319 12:14:37.890596 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzc7\" (UniqueName: \"kubernetes.io/projected/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-kube-api-access-8vzc7\") pod \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " Mar 19 12:14:37.890698 master-0 kubenswrapper[17644]: I0319 12:14:37.890637 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-oauth-serving-cert\") pod \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " Mar 19 12:14:37.890776 master-0 kubenswrapper[17644]: I0319 12:14:37.890712 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-serving-cert\") pod \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " Mar 19 12:14:37.890828 master-0 kubenswrapper[17644]: 
I0319 12:14:37.890792 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-trusted-ca-bundle\") pod \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " Mar 19 12:14:37.890876 master-0 kubenswrapper[17644]: I0319 12:14:37.890700 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-service-ca" (OuterVolumeSpecName: "service-ca") pod "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" (UID: "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:37.891424 master-0 kubenswrapper[17644]: I0319 12:14:37.891381 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" (UID: "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:37.891424 master-0 kubenswrapper[17644]: I0319 12:14:37.891393 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" (UID: "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:37.892436 master-0 kubenswrapper[17644]: I0319 12:14:37.892389 17644 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:37.892680 master-0 kubenswrapper[17644]: I0319 12:14:37.892448 17644 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:37.892680 master-0 kubenswrapper[17644]: I0319 12:14:37.892466 17644 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:37.893340 master-0 kubenswrapper[17644]: I0319 12:14:37.893281 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" (UID: "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:14:37.894824 master-0 kubenswrapper[17644]: I0319 12:14:37.894670 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-kube-api-access-8vzc7" (OuterVolumeSpecName: "kube-api-access-8vzc7") pod "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" (UID: "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523"). InnerVolumeSpecName "kube-api-access-8vzc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:14:37.895387 master-0 kubenswrapper[17644]: I0319 12:14:37.895361 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" (UID: "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:14:37.994602 master-0 kubenswrapper[17644]: I0319 12:14:37.994503 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-config\") pod \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\" (UID: \"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523\") " Mar 19 12:14:37.995264 master-0 kubenswrapper[17644]: I0319 12:14:37.995205 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-config" (OuterVolumeSpecName: "console-config") pod "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" (UID: "9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:37.995485 master-0 kubenswrapper[17644]: I0319 12:14:37.995449 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vzc7\" (UniqueName: \"kubernetes.io/projected/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-kube-api-access-8vzc7\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:37.995485 master-0 kubenswrapper[17644]: I0319 12:14:37.995475 17644 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:37.995575 master-0 kubenswrapper[17644]: I0319 12:14:37.995511 17644 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:37.995575 master-0 kubenswrapper[17644]: I0319 12:14:37.995529 17644 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:38.624433 master-0 kubenswrapper[17644]: I0319 12:14:38.624401 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79d675c7f7-j6qn8_9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523/console/0.log" Mar 19 12:14:38.625017 master-0 kubenswrapper[17644]: I0319 12:14:38.624994 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79d675c7f7-j6qn8" event={"ID":"9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523","Type":"ContainerDied","Data":"cd47fc80fc85c8d2e0f19092f5d2957662547284c5257034e2300e8868387e56"} Mar 19 12:14:38.625108 master-0 kubenswrapper[17644]: I0319 12:14:38.625058 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79d675c7f7-j6qn8" Mar 19 12:14:38.625195 master-0 kubenswrapper[17644]: I0319 12:14:38.625096 17644 scope.go:117] "RemoveContainer" containerID="a4160f23177548bdf9324ff1b1f2bd4e29bbd67a2b0a7a97490160a41439a9a0" Mar 19 12:14:38.667884 master-0 kubenswrapper[17644]: I0319 12:14:38.667698 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79d675c7f7-j6qn8"] Mar 19 12:14:38.687148 master-0 kubenswrapper[17644]: I0319 12:14:38.687051 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79d675c7f7-j6qn8"] Mar 19 12:14:40.492206 master-0 kubenswrapper[17644]: I0319 12:14:40.492159 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" path="/var/lib/kubelet/pods/9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523/volumes" Mar 19 12:14:47.625108 master-0 kubenswrapper[17644]: I0319 12:14:47.625049 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7db659c55f-mfdrv" podUID="5696b619-f43e-47a7-b557-5a6abc07cd2a" containerName="console" containerID="cri-o://3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4" gracePeriod=15 Mar 19 12:14:48.051522 master-0 kubenswrapper[17644]: I0319 12:14:48.051478 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7db659c55f-mfdrv_5696b619-f43e-47a7-b557-5a6abc07cd2a/console/0.log" Mar 19 12:14:48.051767 master-0 kubenswrapper[17644]: I0319 12:14:48.051551 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:14:48.163672 master-0 kubenswrapper[17644]: I0319 12:14:48.163601 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-serving-cert\") pod \"5696b619-f43e-47a7-b557-5a6abc07cd2a\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " Mar 19 12:14:48.163672 master-0 kubenswrapper[17644]: I0319 12:14:48.163659 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-trusted-ca-bundle\") pod \"5696b619-f43e-47a7-b557-5a6abc07cd2a\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " Mar 19 12:14:48.164090 master-0 kubenswrapper[17644]: I0319 12:14:48.163714 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-config\") pod \"5696b619-f43e-47a7-b557-5a6abc07cd2a\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " Mar 19 12:14:48.164090 master-0 kubenswrapper[17644]: I0319 12:14:48.163796 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-service-ca\") pod \"5696b619-f43e-47a7-b557-5a6abc07cd2a\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " Mar 19 12:14:48.164090 master-0 kubenswrapper[17644]: I0319 12:14:48.163908 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-oauth-serving-cert\") pod \"5696b619-f43e-47a7-b557-5a6abc07cd2a\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " Mar 19 12:14:48.164650 master-0 kubenswrapper[17644]: I0319 
12:14:48.164465 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-config" (OuterVolumeSpecName: "console-config") pod "5696b619-f43e-47a7-b557-5a6abc07cd2a" (UID: "5696b619-f43e-47a7-b557-5a6abc07cd2a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:48.164741 master-0 kubenswrapper[17644]: I0319 12:14:48.164653 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5696b619-f43e-47a7-b557-5a6abc07cd2a" (UID: "5696b619-f43e-47a7-b557-5a6abc07cd2a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:48.164741 master-0 kubenswrapper[17644]: I0319 12:14:48.164484 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5696b619-f43e-47a7-b557-5a6abc07cd2a" (UID: "5696b619-f43e-47a7-b557-5a6abc07cd2a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:48.164880 master-0 kubenswrapper[17644]: I0319 12:14:48.164841 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-service-ca" (OuterVolumeSpecName: "service-ca") pod "5696b619-f43e-47a7-b557-5a6abc07cd2a" (UID: "5696b619-f43e-47a7-b557-5a6abc07cd2a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:48.164988 master-0 kubenswrapper[17644]: I0319 12:14:48.164942 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-oauth-config\") pod \"5696b619-f43e-47a7-b557-5a6abc07cd2a\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " Mar 19 12:14:48.165612 master-0 kubenswrapper[17644]: I0319 12:14:48.165569 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66v85\" (UniqueName: \"kubernetes.io/projected/5696b619-f43e-47a7-b557-5a6abc07cd2a-kube-api-access-66v85\") pod \"5696b619-f43e-47a7-b557-5a6abc07cd2a\" (UID: \"5696b619-f43e-47a7-b557-5a6abc07cd2a\") " Mar 19 12:14:48.166426 master-0 kubenswrapper[17644]: I0319 12:14:48.166377 17644 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:48.166426 master-0 kubenswrapper[17644]: I0319 12:14:48.166411 17644 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:48.166426 master-0 kubenswrapper[17644]: I0319 12:14:48.166424 17644 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:48.166591 master-0 kubenswrapper[17644]: I0319 12:14:48.166438 17644 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5696b619-f43e-47a7-b557-5a6abc07cd2a-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:48.168208 
master-0 kubenswrapper[17644]: I0319 12:14:48.168178 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5696b619-f43e-47a7-b557-5a6abc07cd2a" (UID: "5696b619-f43e-47a7-b557-5a6abc07cd2a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:14:48.168363 master-0 kubenswrapper[17644]: I0319 12:14:48.168311 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5696b619-f43e-47a7-b557-5a6abc07cd2a" (UID: "5696b619-f43e-47a7-b557-5a6abc07cd2a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:14:48.168769 master-0 kubenswrapper[17644]: I0319 12:14:48.168707 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5696b619-f43e-47a7-b557-5a6abc07cd2a-kube-api-access-66v85" (OuterVolumeSpecName: "kube-api-access-66v85") pod "5696b619-f43e-47a7-b557-5a6abc07cd2a" (UID: "5696b619-f43e-47a7-b557-5a6abc07cd2a"). InnerVolumeSpecName "kube-api-access-66v85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:14:48.268509 master-0 kubenswrapper[17644]: I0319 12:14:48.268293 17644 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:48.268509 master-0 kubenswrapper[17644]: I0319 12:14:48.268368 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66v85\" (UniqueName: \"kubernetes.io/projected/5696b619-f43e-47a7-b557-5a6abc07cd2a-kube-api-access-66v85\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:48.268509 master-0 kubenswrapper[17644]: I0319 12:14:48.268380 17644 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5696b619-f43e-47a7-b557-5a6abc07cd2a-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:48.721428 master-0 kubenswrapper[17644]: I0319 12:14:48.721350 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7db659c55f-mfdrv_5696b619-f43e-47a7-b557-5a6abc07cd2a/console/0.log" Mar 19 12:14:48.722203 master-0 kubenswrapper[17644]: I0319 12:14:48.721451 17644 generic.go:334] "Generic (PLEG): container finished" podID="5696b619-f43e-47a7-b557-5a6abc07cd2a" containerID="3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4" exitCode=2 Mar 19 12:14:48.722203 master-0 kubenswrapper[17644]: I0319 12:14:48.721512 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db659c55f-mfdrv" event={"ID":"5696b619-f43e-47a7-b557-5a6abc07cd2a","Type":"ContainerDied","Data":"3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4"} Mar 19 12:14:48.722203 master-0 kubenswrapper[17644]: I0319 12:14:48.721566 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db659c55f-mfdrv" 
event={"ID":"5696b619-f43e-47a7-b557-5a6abc07cd2a","Type":"ContainerDied","Data":"eb1ac23240d2fb118e9a10c9a5060cc9fb1f8178658db8704746ed8463a63f5f"} Mar 19 12:14:48.722203 master-0 kubenswrapper[17644]: I0319 12:14:48.721603 17644 scope.go:117] "RemoveContainer" containerID="3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4" Mar 19 12:14:48.722203 master-0 kubenswrapper[17644]: I0319 12:14:48.721954 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7db659c55f-mfdrv" Mar 19 12:14:48.757680 master-0 kubenswrapper[17644]: I0319 12:14:48.757624 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7db659c55f-mfdrv"] Mar 19 12:14:48.763124 master-0 kubenswrapper[17644]: I0319 12:14:48.763068 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7db659c55f-mfdrv"] Mar 19 12:14:48.764432 master-0 kubenswrapper[17644]: I0319 12:14:48.764396 17644 scope.go:117] "RemoveContainer" containerID="3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4" Mar 19 12:14:48.765049 master-0 kubenswrapper[17644]: E0319 12:14:48.765011 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4\": container with ID starting with 3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4 not found: ID does not exist" containerID="3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4" Mar 19 12:14:48.765103 master-0 kubenswrapper[17644]: I0319 12:14:48.765074 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4"} err="failed to get container status \"3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4\": rpc error: code = NotFound desc = could not find container 
\"3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4\": container with ID starting with 3a8deaa0c3798324d343c7c041752a4bee4b0423fa2d0ec1b457a6a15b270bb4 not found: ID does not exist" Mar 19 12:14:50.494207 master-0 kubenswrapper[17644]: I0319 12:14:50.494127 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5696b619-f43e-47a7-b557-5a6abc07cd2a" path="/var/lib/kubelet/pods/5696b619-f43e-47a7-b557-5a6abc07cd2a/volumes" Mar 19 12:15:03.008316 master-0 kubenswrapper[17644]: I0319 12:15:03.008243 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"] Mar 19 12:15:03.009218 master-0 kubenswrapper[17644]: E0319 12:15:03.008643 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5696b619-f43e-47a7-b557-5a6abc07cd2a" containerName="console" Mar 19 12:15:03.009218 master-0 kubenswrapper[17644]: I0319 12:15:03.008660 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="5696b619-f43e-47a7-b557-5a6abc07cd2a" containerName="console" Mar 19 12:15:03.009218 master-0 kubenswrapper[17644]: E0319 12:15:03.008683 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" containerName="console" Mar 19 12:15:03.009218 master-0 kubenswrapper[17644]: I0319 12:15:03.008692 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" containerName="console" Mar 19 12:15:03.009218 master-0 kubenswrapper[17644]: I0319 12:15:03.008882 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="5696b619-f43e-47a7-b557-5a6abc07cd2a" containerName="console" Mar 19 12:15:03.009218 master-0 kubenswrapper[17644]: I0319 12:15:03.008901 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e9110ab-58dc-4cdc-b5f8-5ed8ba5a4523" containerName="console" Mar 19 12:15:03.010038 master-0 kubenswrapper[17644]: I0319 12:15:03.010005 
17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.026258 master-0 kubenswrapper[17644]: I0319 12:15:03.026180 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"]
Mar 19 12:15:03.211967 master-0 kubenswrapper[17644]: I0319 12:15:03.211879 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blk94\" (UniqueName: \"kubernetes.io/projected/1c8bb9fa-3dd1-4913-897f-780c4183773d-kube-api-access-blk94\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.212420 master-0 kubenswrapper[17644]: I0319 12:15:03.212071 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.212560 master-0 kubenswrapper[17644]: I0319 12:15:03.212511 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.314239 master-0 kubenswrapper[17644]: I0319 12:15:03.313987 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.314239 master-0 kubenswrapper[17644]: I0319 12:15:03.314152 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blk94\" (UniqueName: \"kubernetes.io/projected/1c8bb9fa-3dd1-4913-897f-780c4183773d-kube-api-access-blk94\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.314239 master-0 kubenswrapper[17644]: I0319 12:15:03.314201 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.314905 master-0 kubenswrapper[17644]: I0319 12:15:03.314588 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.314905 master-0 kubenswrapper[17644]: I0319 12:15:03.314797 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.340124 master-0 kubenswrapper[17644]: I0319 12:15:03.340006 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blk94\" (UniqueName: \"kubernetes.io/projected/1c8bb9fa-3dd1-4913-897f-780c4183773d-kube-api-access-blk94\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:03.626204 master-0 kubenswrapper[17644]: I0319 12:15:03.625995 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:04.053530 master-0 kubenswrapper[17644]: I0319 12:15:04.053464 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"]
Mar 19 12:15:04.057308 master-0 kubenswrapper[17644]: W0319 12:15:04.057120 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c8bb9fa_3dd1_4913_897f_780c4183773d.slice/crio-1a674900239cafa2998a75c4dd8aa90ffbf9bdbd9efacccdbc34ae3abd2b3f62 WatchSource:0}: Error finding container 1a674900239cafa2998a75c4dd8aa90ffbf9bdbd9efacccdbc34ae3abd2b3f62: Status 404 returned error can't find the container with id 1a674900239cafa2998a75c4dd8aa90ffbf9bdbd9efacccdbc34ae3abd2b3f62
Mar 19 12:15:04.858290 master-0 kubenswrapper[17644]: I0319 12:15:04.858219 17644 generic.go:334] "Generic (PLEG): container finished" podID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerID="cc2287c3cdf5c471af4c7e8a74d73dca587dcf8c4a2e27453c521c07523fa6c7" exitCode=0
Mar 19 12:15:04.858565 master-0 kubenswrapper[17644]: I0319 12:15:04.858295 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l" event={"ID":"1c8bb9fa-3dd1-4913-897f-780c4183773d","Type":"ContainerDied","Data":"cc2287c3cdf5c471af4c7e8a74d73dca587dcf8c4a2e27453c521c07523fa6c7"}
Mar 19 12:15:04.858565 master-0 kubenswrapper[17644]: I0319 12:15:04.858337 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l" event={"ID":"1c8bb9fa-3dd1-4913-897f-780c4183773d","Type":"ContainerStarted","Data":"1a674900239cafa2998a75c4dd8aa90ffbf9bdbd9efacccdbc34ae3abd2b3f62"}
Mar 19 12:15:04.860244 master-0 kubenswrapper[17644]: I0319 12:15:04.860211 17644 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 12:15:09.912421 master-0 kubenswrapper[17644]: I0319 12:15:09.912317 17644 generic.go:334] "Generic (PLEG): container finished" podID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerID="92dd5a3e7271f2d210b3eb5c5ad7e89bf8c9fb75ca31c013be1e075b50c039a0" exitCode=0
Mar 19 12:15:09.912421 master-0 kubenswrapper[17644]: I0319 12:15:09.912413 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l" event={"ID":"1c8bb9fa-3dd1-4913-897f-780c4183773d","Type":"ContainerDied","Data":"92dd5a3e7271f2d210b3eb5c5ad7e89bf8c9fb75ca31c013be1e075b50c039a0"}
Mar 19 12:15:10.927035 master-0 kubenswrapper[17644]: I0319 12:15:10.926904 17644 generic.go:334] "Generic (PLEG): container finished" podID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerID="ea12df054dd2738a0723c9a696ddcf28b3fb890b6a08fbe20d5a371a2e9a0740" exitCode=0
Mar 19 12:15:10.927979 master-0 kubenswrapper[17644]: I0319 12:15:10.927175 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l" event={"ID":"1c8bb9fa-3dd1-4913-897f-780c4183773d","Type":"ContainerDied","Data":"ea12df054dd2738a0723c9a696ddcf28b3fb890b6a08fbe20d5a371a2e9a0740"}
Mar 19 12:15:12.297821 master-0 kubenswrapper[17644]: I0319 12:15:12.296461 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:12.393799 master-0 kubenswrapper[17644]: I0319 12:15:12.393676 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blk94\" (UniqueName: \"kubernetes.io/projected/1c8bb9fa-3dd1-4913-897f-780c4183773d-kube-api-access-blk94\") pod \"1c8bb9fa-3dd1-4913-897f-780c4183773d\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") "
Mar 19 12:15:12.393799 master-0 kubenswrapper[17644]: I0319 12:15:12.393765 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-util\") pod \"1c8bb9fa-3dd1-4913-897f-780c4183773d\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") "
Mar 19 12:15:12.394136 master-0 kubenswrapper[17644]: I0319 12:15:12.393823 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-bundle\") pod \"1c8bb9fa-3dd1-4913-897f-780c4183773d\" (UID: \"1c8bb9fa-3dd1-4913-897f-780c4183773d\") "
Mar 19 12:15:12.395093 master-0 kubenswrapper[17644]: I0319 12:15:12.394893 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-bundle" (OuterVolumeSpecName: "bundle") pod "1c8bb9fa-3dd1-4913-897f-780c4183773d" (UID: "1c8bb9fa-3dd1-4913-897f-780c4183773d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:15:12.395828 master-0 kubenswrapper[17644]: I0319 12:15:12.395765 17644 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:12.397283 master-0 kubenswrapper[17644]: I0319 12:15:12.397226 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8bb9fa-3dd1-4913-897f-780c4183773d-kube-api-access-blk94" (OuterVolumeSpecName: "kube-api-access-blk94") pod "1c8bb9fa-3dd1-4913-897f-780c4183773d" (UID: "1c8bb9fa-3dd1-4913-897f-780c4183773d"). InnerVolumeSpecName "kube-api-access-blk94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:15:12.406397 master-0 kubenswrapper[17644]: I0319 12:15:12.406187 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-util" (OuterVolumeSpecName: "util") pod "1c8bb9fa-3dd1-4913-897f-780c4183773d" (UID: "1c8bb9fa-3dd1-4913-897f-780c4183773d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:15:12.497652 master-0 kubenswrapper[17644]: I0319 12:15:12.497552 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blk94\" (UniqueName: \"kubernetes.io/projected/1c8bb9fa-3dd1-4913-897f-780c4183773d-kube-api-access-blk94\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:12.497652 master-0 kubenswrapper[17644]: I0319 12:15:12.497621 17644 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c8bb9fa-3dd1-4913-897f-780c4183773d-util\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:12.947378 master-0 kubenswrapper[17644]: I0319 12:15:12.947287 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l" event={"ID":"1c8bb9fa-3dd1-4913-897f-780c4183773d","Type":"ContainerDied","Data":"1a674900239cafa2998a75c4dd8aa90ffbf9bdbd9efacccdbc34ae3abd2b3f62"}
Mar 19 12:15:12.947378 master-0 kubenswrapper[17644]: I0319 12:15:12.947343 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a674900239cafa2998a75c4dd8aa90ffbf9bdbd9efacccdbc34ae3abd2b3f62"
Mar 19 12:15:12.947997 master-0 kubenswrapper[17644]: I0319 12:15:12.947454 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4w484l"
Mar 19 12:15:22.923291 master-0 kubenswrapper[17644]: I0319 12:15:22.923217 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-7d575f666-kbbmh"]
Mar 19 12:15:22.923929 master-0 kubenswrapper[17644]: E0319 12:15:22.923612 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerName="pull"
Mar 19 12:15:22.923929 master-0 kubenswrapper[17644]: I0319 12:15:22.923630 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerName="pull"
Mar 19 12:15:22.923929 master-0 kubenswrapper[17644]: E0319 12:15:22.923649 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerName="util"
Mar 19 12:15:22.923929 master-0 kubenswrapper[17644]: I0319 12:15:22.923658 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerName="util"
Mar 19 12:15:22.923929 master-0 kubenswrapper[17644]: E0319 12:15:22.923686 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerName="extract"
Mar 19 12:15:22.923929 master-0 kubenswrapper[17644]: I0319 12:15:22.923693 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerName="extract"
Mar 19 12:15:22.923929 master-0 kubenswrapper[17644]: I0319 12:15:22.923929 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8bb9fa-3dd1-4913-897f-780c4183773d" containerName="extract"
Mar 19 12:15:22.924608 master-0 kubenswrapper[17644]: I0319 12:15:22.924584 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:22.929449 master-0 kubenswrapper[17644]: I0319 12:15:22.929405 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert"
Mar 19 12:15:22.929704 master-0 kubenswrapper[17644]: I0319 12:15:22.929674 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert"
Mar 19 12:15:22.929704 master-0 kubenswrapper[17644]: I0319 12:15:22.929444 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt"
Mar 19 12:15:22.929845 master-0 kubenswrapper[17644]: I0319 12:15:22.929508 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt"
Mar 19 12:15:22.929885 master-0 kubenswrapper[17644]: I0319 12:15:22.929609 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert"
Mar 19 12:15:22.946569 master-0 kubenswrapper[17644]: I0319 12:15:22.946504 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-7d575f666-kbbmh"]
Mar 19 12:15:23.074612 master-0 kubenswrapper[17644]: I0319 12:15:23.074498 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-webhook-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.075256 master-0 kubenswrapper[17644]: I0319 12:15:23.075191 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/130b56a7-a29e-4cc4-8464-bd5353d37d7a-socket-dir\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.075315 master-0 kubenswrapper[17644]: I0319 12:15:23.075272 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-metrics-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.075487 master-0 kubenswrapper[17644]: I0319 12:15:23.075458 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vrj\" (UniqueName: \"kubernetes.io/projected/130b56a7-a29e-4cc4-8464-bd5353d37d7a-kube-api-access-n2vrj\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.075598 master-0 kubenswrapper[17644]: I0319 12:15:23.075576 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-apiservice-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.177586 master-0 kubenswrapper[17644]: I0319 12:15:23.177443 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-apiservice-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.177821 master-0 kubenswrapper[17644]: I0319 12:15:23.177694 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-webhook-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.177886 master-0 kubenswrapper[17644]: I0319 12:15:23.177835 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/130b56a7-a29e-4cc4-8464-bd5353d37d7a-socket-dir\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.177886 master-0 kubenswrapper[17644]: I0319 12:15:23.177859 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-metrics-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.178022 master-0 kubenswrapper[17644]: I0319 12:15:23.177990 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2vrj\" (UniqueName: \"kubernetes.io/projected/130b56a7-a29e-4cc4-8464-bd5353d37d7a-kube-api-access-n2vrj\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.178651 master-0 kubenswrapper[17644]: I0319 12:15:23.178589 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/130b56a7-a29e-4cc4-8464-bd5353d37d7a-socket-dir\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.181426 master-0 kubenswrapper[17644]: I0319 12:15:23.181182 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-metrics-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.192080 master-0 kubenswrapper[17644]: I0319 12:15:23.192011 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-apiservice-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.192995 master-0 kubenswrapper[17644]: I0319 12:15:23.192934 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/130b56a7-a29e-4cc4-8464-bd5353d37d7a-webhook-cert\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.199610 master-0 kubenswrapper[17644]: I0319 12:15:23.199532 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2vrj\" (UniqueName: \"kubernetes.io/projected/130b56a7-a29e-4cc4-8464-bd5353d37d7a-kube-api-access-n2vrj\") pod \"lvms-operator-7d575f666-kbbmh\" (UID: \"130b56a7-a29e-4cc4-8464-bd5353d37d7a\") " pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.242031 master-0 kubenswrapper[17644]: I0319 12:15:23.241974 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:23.693071 master-0 kubenswrapper[17644]: I0319 12:15:23.693010 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-7d575f666-kbbmh"]
Mar 19 12:15:24.021817 master-0 kubenswrapper[17644]: I0319 12:15:24.021632 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-7d575f666-kbbmh" event={"ID":"130b56a7-a29e-4cc4-8464-bd5353d37d7a","Type":"ContainerStarted","Data":"ec41c59aab07a47bb7eab52b15b87486b43caa0c7cda620f0fb48944ea9d7397"}
Mar 19 12:15:29.069163 master-0 kubenswrapper[17644]: I0319 12:15:29.069069 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-7d575f666-kbbmh" event={"ID":"130b56a7-a29e-4cc4-8464-bd5353d37d7a","Type":"ContainerStarted","Data":"bc149bba01655cd20f37ee3c1da33ff61e76377f8310cba4930cf605e050812b"}
Mar 19 12:15:29.069750 master-0 kubenswrapper[17644]: I0319 12:15:29.069389 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:29.072565 master-0 kubenswrapper[17644]: I0319 12:15:29.072526 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-7d575f666-kbbmh"
Mar 19 12:15:29.097845 master-0 kubenswrapper[17644]: I0319 12:15:29.095694 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-7d575f666-kbbmh" podStartSLOduration=2.404602596 podStartE2EDuration="7.095671821s" podCreationTimestamp="2026-03-19 12:15:22 +0000 UTC" firstStartedPulling="2026-03-19 12:15:23.698305128 +0000 UTC m=+957.468263163" lastFinishedPulling="2026-03-19 12:15:28.389374353 +0000 UTC m=+962.159332388" observedRunningTime="2026-03-19 12:15:29.094586434 +0000 UTC m=+962.864544489" watchObservedRunningTime="2026-03-19 12:15:29.095671821 +0000 UTC m=+962.865629856"
Mar 19 12:15:33.003889 master-0 kubenswrapper[17644]: I0319 12:15:33.003804 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"]
Mar 19 12:15:33.012898 master-0 kubenswrapper[17644]: I0319 12:15:33.012814 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.078033 master-0 kubenswrapper[17644]: I0319 12:15:33.077984 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"]
Mar 19 12:15:33.160675 master-0 kubenswrapper[17644]: I0319 12:15:33.160587 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.160983 master-0 kubenswrapper[17644]: I0319 12:15:33.160831 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.160983 master-0 kubenswrapper[17644]: I0319 12:15:33.160949 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25f7\" (UniqueName: \"kubernetes.io/projected/157911ad-1a9e-46fa-bf3e-bd268714ea05-kube-api-access-s25f7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.264923 master-0 kubenswrapper[17644]: I0319 12:15:33.264767 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.265309 master-0 kubenswrapper[17644]: I0319 12:15:33.265274 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s25f7\" (UniqueName: \"kubernetes.io/projected/157911ad-1a9e-46fa-bf3e-bd268714ea05-kube-api-access-s25f7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.265572 master-0 kubenswrapper[17644]: I0319 12:15:33.265537 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.266221 master-0 kubenswrapper[17644]: I0319 12:15:33.266164 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.266322 master-0 kubenswrapper[17644]: I0319 12:15:33.266222 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.291828 master-0 kubenswrapper[17644]: I0319 12:15:33.291759 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25f7\" (UniqueName: \"kubernetes.io/projected/157911ad-1a9e-46fa-bf3e-bd268714ea05-kube-api-access-s25f7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.332610 master-0 kubenswrapper[17644]: I0319 12:15:33.332533 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"
Mar 19 12:15:33.777069 master-0 kubenswrapper[17644]: I0319 12:15:33.777007 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff"]
Mar 19 12:15:33.784504 master-0 kubenswrapper[17644]: W0319 12:15:33.784473 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157911ad_1a9e_46fa_bf3e_bd268714ea05.slice/crio-b5902978403ced3d282b8bb42f9141631dc7407d63d31ca4b8d25f3611845cab WatchSource:0}: Error finding container b5902978403ced3d282b8bb42f9141631dc7407d63d31ca4b8d25f3611845cab: Status 404 returned error can't find the container with id b5902978403ced3d282b8bb42f9141631dc7407d63d31ca4b8d25f3611845cab
Mar 19 12:15:34.010676 master-0 kubenswrapper[17644]: I0319 12:15:34.010598 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"]
Mar 19 12:15:34.012451 master-0 kubenswrapper[17644]: I0319 12:15:34.012415 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.027455 master-0 kubenswrapper[17644]: I0319 12:15:34.027364 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"]
Mar 19 12:15:34.119272 master-0 kubenswrapper[17644]: I0319 12:15:34.119204 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff" event={"ID":"157911ad-1a9e-46fa-bf3e-bd268714ea05","Type":"ContainerStarted","Data":"b71d998fd17b5225c9b203b995c4f763696851352f8246599fbf4622b9c4a2de"}
Mar 19 12:15:34.119272 master-0 kubenswrapper[17644]: I0319 12:15:34.119272 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff" event={"ID":"157911ad-1a9e-46fa-bf3e-bd268714ea05","Type":"ContainerStarted","Data":"b5902978403ced3d282b8bb42f9141631dc7407d63d31ca4b8d25f3611845cab"}
Mar 19 12:15:34.180638 master-0 kubenswrapper[17644]: I0319 12:15:34.180513 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.180638 master-0 kubenswrapper[17644]: I0319 12:15:34.180575 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.181010 master-0 kubenswrapper[17644]: I0319 12:15:34.180865 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdplg\" (UniqueName: \"kubernetes.io/projected/45342c9f-825e-4171-a078-3b175362c618-kube-api-access-wdplg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.283078 master-0 kubenswrapper[17644]: I0319 12:15:34.282908 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.283078 master-0 kubenswrapper[17644]: I0319 12:15:34.282983 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.283309 master-0 kubenswrapper[17644]: I0319 12:15:34.283186 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdplg\" (UniqueName: \"kubernetes.io/projected/45342c9f-825e-4171-a078-3b175362c618-kube-api-access-wdplg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.283679 master-0 kubenswrapper[17644]: I0319 12:15:34.283608 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.283679 master-0 kubenswrapper[17644]: I0319 12:15:34.283637 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.363506 master-0 kubenswrapper[17644]: I0319 12:15:34.363446 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdplg\" (UniqueName: \"kubernetes.io/projected/45342c9f-825e-4171-a078-3b175362c618-kube-api-access-wdplg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.642883 master-0 kubenswrapper[17644]: I0319 12:15:34.642678 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"
Mar 19 12:15:34.810678 master-0 kubenswrapper[17644]: I0319 12:15:34.810620 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"]
Mar 19 12:15:34.812255 master-0 kubenswrapper[17644]: I0319 12:15:34.812197 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:34.835421 master-0 kubenswrapper[17644]: I0319 12:15:34.835367 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"]
Mar 19 12:15:34.905876 master-0 kubenswrapper[17644]: I0319 12:15:34.905650 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqz5k\" (UniqueName: \"kubernetes.io/projected/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-kube-api-access-xqz5k\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:34.905876 master-0 kubenswrapper[17644]: I0319 12:15:34.905796 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:34.905876 master-0 kubenswrapper[17644]: I0319 12:15:34.905857 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:35.007368 master-0 kubenswrapper[17644]: I0319 12:15:35.007313 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz5k\" (UniqueName: \"kubernetes.io/projected/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-kube-api-access-xqz5k\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:35.007682 master-0 kubenswrapper[17644]: I0319 12:15:35.007660 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:35.007844 master-0 kubenswrapper[17644]: I0319 12:15:35.007821 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:35.008468 master-0 kubenswrapper[17644]: I0319 12:15:35.008402 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:35.008468 master-0 kubenswrapper[17644]: I0319 12:15:35.008402 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:35.021977 master-0 kubenswrapper[17644]: I0319 12:15:35.021939 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqz5k\" (UniqueName: \"kubernetes.io/projected/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-kube-api-access-xqz5k\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"
Mar 19 12:15:35.069888 master-0 kubenswrapper[17644]: I0319 12:15:35.069826 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc"]
Mar 19 12:15:35.128511 master-0 kubenswrapper[17644]: I0319 12:15:35.128432 17644 generic.go:334] "Generic (PLEG): container finished" podID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerID="b71d998fd17b5225c9b203b995c4f763696851352f8246599fbf4622b9c4a2de" exitCode=0
Mar 19 12:15:35.128687 master-0 kubenswrapper[17644]: I0319 12:15:35.128575 17644 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6" Mar 19 12:15:35.128992 master-0 kubenswrapper[17644]: I0319 12:15:35.128879 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff" event={"ID":"157911ad-1a9e-46fa-bf3e-bd268714ea05","Type":"ContainerDied","Data":"b71d998fd17b5225c9b203b995c4f763696851352f8246599fbf4622b9c4a2de"} Mar 19 12:15:35.133410 master-0 kubenswrapper[17644]: I0319 12:15:35.133343 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc" event={"ID":"45342c9f-825e-4171-a078-3b175362c618","Type":"ContainerStarted","Data":"5f449fb4ad4a8f55c75c6c3fcd8a173f715941678498837b596529d90ab4ac9e"} Mar 19 12:15:35.534637 master-0 kubenswrapper[17644]: W0319 12:15:35.534596 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3874aa74_b4b4_48e8_ab72_e1b5a7f6f969.slice/crio-90598356b8ec2271daef2b3147ba148b8db301d55768cd2b193a4c111641369a WatchSource:0}: Error finding container 90598356b8ec2271daef2b3147ba148b8db301d55768cd2b193a4c111641369a: Status 404 returned error can't find the container with id 90598356b8ec2271daef2b3147ba148b8db301d55768cd2b193a4c111641369a Mar 19 12:15:35.536985 master-0 kubenswrapper[17644]: I0319 12:15:35.536900 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6"] Mar 19 12:15:36.141537 master-0 kubenswrapper[17644]: I0319 12:15:36.141430 17644 generic.go:334] "Generic (PLEG): container finished" podID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerID="8d461ff8ef6012882d50b85fef13bb86dc7412842d5b601893b6fb6b5522a405" exitCode=0 Mar 19 12:15:36.142112 master-0 kubenswrapper[17644]: I0319 12:15:36.141542 17644 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6" event={"ID":"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969","Type":"ContainerDied","Data":"8d461ff8ef6012882d50b85fef13bb86dc7412842d5b601893b6fb6b5522a405"} Mar 19 12:15:36.142112 master-0 kubenswrapper[17644]: I0319 12:15:36.141621 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6" event={"ID":"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969","Type":"ContainerStarted","Data":"90598356b8ec2271daef2b3147ba148b8db301d55768cd2b193a4c111641369a"} Mar 19 12:15:36.144066 master-0 kubenswrapper[17644]: I0319 12:15:36.144038 17644 generic.go:334] "Generic (PLEG): container finished" podID="45342c9f-825e-4171-a078-3b175362c618" containerID="2d0d128c5c7530effa234f4a7a541266a2cb81752c920487d29a1b961920cd48" exitCode=0 Mar 19 12:15:36.144125 master-0 kubenswrapper[17644]: I0319 12:15:36.144075 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc" event={"ID":"45342c9f-825e-4171-a078-3b175362c618","Type":"ContainerDied","Data":"2d0d128c5c7530effa234f4a7a541266a2cb81752c920487d29a1b961920cd48"} Mar 19 12:15:40.185693 master-0 kubenswrapper[17644]: I0319 12:15:40.185603 17644 generic.go:334] "Generic (PLEG): container finished" podID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerID="163d630ba74d6e76dab6a2ec86ea10bd8c73b99e988fa576daa40073297820c1" exitCode=0 Mar 19 12:15:40.186903 master-0 kubenswrapper[17644]: I0319 12:15:40.186870 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff" event={"ID":"157911ad-1a9e-46fa-bf3e-bd268714ea05","Type":"ContainerDied","Data":"163d630ba74d6e76dab6a2ec86ea10bd8c73b99e988fa576daa40073297820c1"} Mar 19 12:15:40.191022 master-0 
kubenswrapper[17644]: I0319 12:15:40.190974 17644 generic.go:334] "Generic (PLEG): container finished" podID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerID="77af25832d89351bcf5cbe38739d8275fabd8f6fa004ca069ee2fe4dd6d2921f" exitCode=0 Mar 19 12:15:40.191168 master-0 kubenswrapper[17644]: I0319 12:15:40.191083 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6" event={"ID":"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969","Type":"ContainerDied","Data":"77af25832d89351bcf5cbe38739d8275fabd8f6fa004ca069ee2fe4dd6d2921f"} Mar 19 12:15:40.195190 master-0 kubenswrapper[17644]: I0319 12:15:40.195146 17644 generic.go:334] "Generic (PLEG): container finished" podID="45342c9f-825e-4171-a078-3b175362c618" containerID="ba17aa1314d61ff94ad349151a5ad639932974cb6e0ada3a91ee29ff491b3438" exitCode=0 Mar 19 12:15:40.195288 master-0 kubenswrapper[17644]: I0319 12:15:40.195202 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc" event={"ID":"45342c9f-825e-4171-a078-3b175362c618","Type":"ContainerDied","Data":"ba17aa1314d61ff94ad349151a5ad639932974cb6e0ada3a91ee29ff491b3438"} Mar 19 12:15:41.206303 master-0 kubenswrapper[17644]: I0319 12:15:41.206224 17644 generic.go:334] "Generic (PLEG): container finished" podID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerID="8dbe6742c8cb98f8e64a49a8eaaaccc84c8e25a0575e09d8a81af7abf6942430" exitCode=0 Mar 19 12:15:41.206894 master-0 kubenswrapper[17644]: I0319 12:15:41.206323 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff" event={"ID":"157911ad-1a9e-46fa-bf3e-bd268714ea05","Type":"ContainerDied","Data":"8dbe6742c8cb98f8e64a49a8eaaaccc84c8e25a0575e09d8a81af7abf6942430"} Mar 19 12:15:41.209244 master-0 kubenswrapper[17644]: I0319 12:15:41.209208 17644 generic.go:334] 
"Generic (PLEG): container finished" podID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerID="fb86a2e57cc044ca1a6f11c426352eb73bea7976de8c293c5dd8900e8c2f1aa3" exitCode=0 Mar 19 12:15:41.209370 master-0 kubenswrapper[17644]: I0319 12:15:41.209288 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6" event={"ID":"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969","Type":"ContainerDied","Data":"fb86a2e57cc044ca1a6f11c426352eb73bea7976de8c293c5dd8900e8c2f1aa3"} Mar 19 12:15:41.212466 master-0 kubenswrapper[17644]: I0319 12:15:41.212421 17644 generic.go:334] "Generic (PLEG): container finished" podID="45342c9f-825e-4171-a078-3b175362c618" containerID="68bc098196aa1142df586f9fe110223fa18258d554a37869c59587dd563218d6" exitCode=0 Mar 19 12:15:41.212466 master-0 kubenswrapper[17644]: I0319 12:15:41.212469 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc" event={"ID":"45342c9f-825e-4171-a078-3b175362c618","Type":"ContainerDied","Data":"68bc098196aa1142df586f9fe110223fa18258d554a37869c59587dd563218d6"} Mar 19 12:15:41.407030 master-0 kubenswrapper[17644]: I0319 12:15:41.406973 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr"] Mar 19 12:15:41.408354 master-0 kubenswrapper[17644]: I0319 12:15:41.408327 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.429495 master-0 kubenswrapper[17644]: I0319 12:15:41.429436 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr"] Mar 19 12:15:41.519791 master-0 kubenswrapper[17644]: I0319 12:15:41.519641 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdfw9\" (UniqueName: \"kubernetes.io/projected/28a864cc-db2e-4316-ad0f-1cd948ba2214-kube-api-access-rdfw9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.520001 master-0 kubenswrapper[17644]: I0319 12:15:41.519898 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.521320 master-0 kubenswrapper[17644]: I0319 12:15:41.520394 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.622249 master-0 kubenswrapper[17644]: I0319 12:15:41.622165 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.622249 master-0 kubenswrapper[17644]: I0319 12:15:41.622234 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdfw9\" (UniqueName: \"kubernetes.io/projected/28a864cc-db2e-4316-ad0f-1cd948ba2214-kube-api-access-rdfw9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.622249 master-0 kubenswrapper[17644]: I0319 12:15:41.622283 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.622893 master-0 kubenswrapper[17644]: I0319 12:15:41.622784 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.622893 master-0 kubenswrapper[17644]: I0319 12:15:41.622788 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-util\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.638502 master-0 kubenswrapper[17644]: I0319 12:15:41.638427 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdfw9\" (UniqueName: \"kubernetes.io/projected/28a864cc-db2e-4316-ad0f-1cd948ba2214-kube-api-access-rdfw9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:41.722626 master-0 kubenswrapper[17644]: I0319 12:15:41.722564 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:42.120186 master-0 kubenswrapper[17644]: W0319 12:15:42.119954 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a864cc_db2e_4316_ad0f_1cd948ba2214.slice/crio-415a60c5e6474c455bb1428308858267da49bb684dfc87b8dabb4823b2fe811b WatchSource:0}: Error finding container 415a60c5e6474c455bb1428308858267da49bb684dfc87b8dabb4823b2fe811b: Status 404 returned error can't find the container with id 415a60c5e6474c455bb1428308858267da49bb684dfc87b8dabb4823b2fe811b Mar 19 12:15:42.128872 master-0 kubenswrapper[17644]: I0319 12:15:42.128213 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr"] Mar 19 12:15:42.221441 master-0 kubenswrapper[17644]: I0319 12:15:42.221357 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" 
event={"ID":"28a864cc-db2e-4316-ad0f-1cd948ba2214","Type":"ContainerStarted","Data":"415a60c5e6474c455bb1428308858267da49bb684dfc87b8dabb4823b2fe811b"} Mar 19 12:15:42.712757 master-0 kubenswrapper[17644]: I0319 12:15:42.712380 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc" Mar 19 12:15:42.729858 master-0 kubenswrapper[17644]: I0319 12:15:42.728916 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6" Mar 19 12:15:42.771055 master-0 kubenswrapper[17644]: I0319 12:15:42.771007 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff" Mar 19 12:15:42.849096 master-0 kubenswrapper[17644]: I0319 12:15:42.849031 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-util\") pod \"45342c9f-825e-4171-a078-3b175362c618\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " Mar 19 12:15:42.849096 master-0 kubenswrapper[17644]: I0319 12:15:42.849105 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-util\") pod \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " Mar 19 12:15:42.849385 master-0 kubenswrapper[17644]: I0319 12:15:42.849292 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-bundle\") pod \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " Mar 19 12:15:42.849385 master-0 
kubenswrapper[17644]: I0319 12:15:42.849362 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdplg\" (UniqueName: \"kubernetes.io/projected/45342c9f-825e-4171-a078-3b175362c618-kube-api-access-wdplg\") pod \"45342c9f-825e-4171-a078-3b175362c618\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " Mar 19 12:15:42.849470 master-0 kubenswrapper[17644]: I0319 12:15:42.849450 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-bundle\") pod \"157911ad-1a9e-46fa-bf3e-bd268714ea05\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " Mar 19 12:15:42.849532 master-0 kubenswrapper[17644]: I0319 12:15:42.849507 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s25f7\" (UniqueName: \"kubernetes.io/projected/157911ad-1a9e-46fa-bf3e-bd268714ea05-kube-api-access-s25f7\") pod \"157911ad-1a9e-46fa-bf3e-bd268714ea05\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " Mar 19 12:15:42.849574 master-0 kubenswrapper[17644]: I0319 12:15:42.849550 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqz5k\" (UniqueName: \"kubernetes.io/projected/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-kube-api-access-xqz5k\") pod \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\" (UID: \"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969\") " Mar 19 12:15:42.849613 master-0 kubenswrapper[17644]: I0319 12:15:42.849589 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-bundle\") pod \"45342c9f-825e-4171-a078-3b175362c618\" (UID: \"45342c9f-825e-4171-a078-3b175362c618\") " Mar 19 12:15:42.849650 master-0 kubenswrapper[17644]: I0319 12:15:42.849618 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-util\") pod \"157911ad-1a9e-46fa-bf3e-bd268714ea05\" (UID: \"157911ad-1a9e-46fa-bf3e-bd268714ea05\") " Mar 19 12:15:42.851429 master-0 kubenswrapper[17644]: I0319 12:15:42.851361 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-bundle" (OuterVolumeSpecName: "bundle") pod "45342c9f-825e-4171-a078-3b175362c618" (UID: "45342c9f-825e-4171-a078-3b175362c618"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:15:42.852251 master-0 kubenswrapper[17644]: I0319 12:15:42.852173 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-bundle" (OuterVolumeSpecName: "bundle") pod "157911ad-1a9e-46fa-bf3e-bd268714ea05" (UID: "157911ad-1a9e-46fa-bf3e-bd268714ea05"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:15:42.853722 master-0 kubenswrapper[17644]: I0319 12:15:42.853020 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-bundle" (OuterVolumeSpecName: "bundle") pod "3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" (UID: "3874aa74-b4b4-48e8-ab72-e1b5a7f6f969"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:15:42.853722 master-0 kubenswrapper[17644]: I0319 12:15:42.853644 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-kube-api-access-xqz5k" (OuterVolumeSpecName: "kube-api-access-xqz5k") pod "3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" (UID: "3874aa74-b4b4-48e8-ab72-e1b5a7f6f969"). InnerVolumeSpecName "kube-api-access-xqz5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:15:42.854169 master-0 kubenswrapper[17644]: I0319 12:15:42.854087 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45342c9f-825e-4171-a078-3b175362c618-kube-api-access-wdplg" (OuterVolumeSpecName: "kube-api-access-wdplg") pod "45342c9f-825e-4171-a078-3b175362c618" (UID: "45342c9f-825e-4171-a078-3b175362c618"). InnerVolumeSpecName "kube-api-access-wdplg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:15:42.854513 master-0 kubenswrapper[17644]: I0319 12:15:42.854442 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157911ad-1a9e-46fa-bf3e-bd268714ea05-kube-api-access-s25f7" (OuterVolumeSpecName: "kube-api-access-s25f7") pod "157911ad-1a9e-46fa-bf3e-bd268714ea05" (UID: "157911ad-1a9e-46fa-bf3e-bd268714ea05"). InnerVolumeSpecName "kube-api-access-s25f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:15:42.862377 master-0 kubenswrapper[17644]: I0319 12:15:42.862247 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-util" (OuterVolumeSpecName: "util") pod "157911ad-1a9e-46fa-bf3e-bd268714ea05" (UID: "157911ad-1a9e-46fa-bf3e-bd268714ea05"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:15:42.865316 master-0 kubenswrapper[17644]: I0319 12:15:42.865245 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-util" (OuterVolumeSpecName: "util") pod "3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" (UID: "3874aa74-b4b4-48e8-ab72-e1b5a7f6f969"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:15:42.875942 master-0 kubenswrapper[17644]: I0319 12:15:42.875861 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-util" (OuterVolumeSpecName: "util") pod "45342c9f-825e-4171-a078-3b175362c618" (UID: "45342c9f-825e-4171-a078-3b175362c618"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 12:15:42.951803 17644 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 12:15:42.951861 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdplg\" (UniqueName: \"kubernetes.io/projected/45342c9f-825e-4171-a078-3b175362c618-kube-api-access-wdplg\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 12:15:42.951876 17644 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 12:15:42.951886 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s25f7\" (UniqueName: \"kubernetes.io/projected/157911ad-1a9e-46fa-bf3e-bd268714ea05-kube-api-access-s25f7\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 12:15:42.951898 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqz5k\" (UniqueName: \"kubernetes.io/projected/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-kube-api-access-xqz5k\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 
12:15:42.951908 17644 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 12:15:42.951916 17644 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/157911ad-1a9e-46fa-bf3e-bd268714ea05-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 12:15:42.951924 17644 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45342c9f-825e-4171-a078-3b175362c618-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:42.951957 master-0 kubenswrapper[17644]: I0319 12:15:42.951935 17644 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3874aa74-b4b4-48e8-ab72-e1b5a7f6f969-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:43.238032 master-0 kubenswrapper[17644]: I0319 12:15:43.237786 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff" event={"ID":"157911ad-1a9e-46fa-bf3e-bd268714ea05","Type":"ContainerDied","Data":"b5902978403ced3d282b8bb42f9141631dc7407d63d31ca4b8d25f3611845cab"} Mar 19 12:15:43.238032 master-0 kubenswrapper[17644]: I0319 12:15:43.237877 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5902978403ced3d282b8bb42f9141631dc7407d63d31ca4b8d25f3611845cab" Mar 19 12:15:43.238032 master-0 kubenswrapper[17644]: I0319 12:15:43.237976 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55dqff" Mar 19 12:15:43.242271 master-0 kubenswrapper[17644]: I0319 12:15:43.242211 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6" event={"ID":"3874aa74-b4b4-48e8-ab72-e1b5a7f6f969","Type":"ContainerDied","Data":"90598356b8ec2271daef2b3147ba148b8db301d55768cd2b193a4c111641369a"} Mar 19 12:15:43.242271 master-0 kubenswrapper[17644]: I0319 12:15:43.242261 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90598356b8ec2271daef2b3147ba148b8db301d55768cd2b193a4c111641369a" Mar 19 12:15:43.242501 master-0 kubenswrapper[17644]: I0319 12:15:43.242299 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pxct6" Mar 19 12:15:43.245517 master-0 kubenswrapper[17644]: I0319 12:15:43.245412 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc" event={"ID":"45342c9f-825e-4171-a078-3b175362c618","Type":"ContainerDied","Data":"5f449fb4ad4a8f55c75c6c3fcd8a173f715941678498837b596529d90ab4ac9e"} Mar 19 12:15:43.245517 master-0 kubenswrapper[17644]: I0319 12:15:43.245474 17644 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16mkmc" Mar 19 12:15:43.245996 master-0 kubenswrapper[17644]: I0319 12:15:43.245485 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f449fb4ad4a8f55c75c6c3fcd8a173f715941678498837b596529d90ab4ac9e" Mar 19 12:15:43.248465 master-0 kubenswrapper[17644]: I0319 12:15:43.248399 17644 generic.go:334] "Generic (PLEG): container finished" podID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerID="059c67145b150be9a637643c0c7faa19e626e4def09a931cf4958d07a1a2c157" exitCode=0 Mar 19 12:15:43.248611 master-0 kubenswrapper[17644]: I0319 12:15:43.248470 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" event={"ID":"28a864cc-db2e-4316-ad0f-1cd948ba2214","Type":"ContainerDied","Data":"059c67145b150be9a637643c0c7faa19e626e4def09a931cf4958d07a1a2c157"} Mar 19 12:15:45.271791 master-0 kubenswrapper[17644]: I0319 12:15:45.271701 17644 generic.go:334] "Generic (PLEG): container finished" podID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerID="734b84b7b95427335dfc551c784429043fd1e626466b95e374cd1170c1534ed8" exitCode=0 Mar 19 12:15:45.271791 master-0 kubenswrapper[17644]: I0319 12:15:45.271768 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" event={"ID":"28a864cc-db2e-4316-ad0f-1cd948ba2214","Type":"ContainerDied","Data":"734b84b7b95427335dfc551c784429043fd1e626466b95e374cd1170c1534ed8"} Mar 19 12:15:46.308878 master-0 kubenswrapper[17644]: I0319 12:15:46.308801 17644 generic.go:334] "Generic (PLEG): container finished" podID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerID="075c9eed594f37c291d36bf27ec89ea956eb059216d1ea181ec5e859e1f40af6" exitCode=0 Mar 19 12:15:46.310321 master-0 kubenswrapper[17644]: I0319 12:15:46.309895 17644 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" event={"ID":"28a864cc-db2e-4316-ad0f-1cd948ba2214","Type":"ContainerDied","Data":"075c9eed594f37c291d36bf27ec89ea956eb059216d1ea181ec5e859e1f40af6"} Mar 19 12:15:47.691717 master-0 kubenswrapper[17644]: I0319 12:15:47.691639 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:47.860081 master-0 kubenswrapper[17644]: I0319 12:15:47.859954 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdfw9\" (UniqueName: \"kubernetes.io/projected/28a864cc-db2e-4316-ad0f-1cd948ba2214-kube-api-access-rdfw9\") pod \"28a864cc-db2e-4316-ad0f-1cd948ba2214\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " Mar 19 12:15:47.860786 master-0 kubenswrapper[17644]: I0319 12:15:47.860713 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-bundle\") pod \"28a864cc-db2e-4316-ad0f-1cd948ba2214\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " Mar 19 12:15:47.861157 master-0 kubenswrapper[17644]: I0319 12:15:47.860928 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-util\") pod \"28a864cc-db2e-4316-ad0f-1cd948ba2214\" (UID: \"28a864cc-db2e-4316-ad0f-1cd948ba2214\") " Mar 19 12:15:47.862860 master-0 kubenswrapper[17644]: I0319 12:15:47.862616 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a864cc-db2e-4316-ad0f-1cd948ba2214-kube-api-access-rdfw9" (OuterVolumeSpecName: "kube-api-access-rdfw9") pod "28a864cc-db2e-4316-ad0f-1cd948ba2214" (UID: "28a864cc-db2e-4316-ad0f-1cd948ba2214"). 
InnerVolumeSpecName "kube-api-access-rdfw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:15:47.866471 master-0 kubenswrapper[17644]: I0319 12:15:47.866054 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-bundle" (OuterVolumeSpecName: "bundle") pod "28a864cc-db2e-4316-ad0f-1cd948ba2214" (UID: "28a864cc-db2e-4316-ad0f-1cd948ba2214"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:15:47.888218 master-0 kubenswrapper[17644]: I0319 12:15:47.887851 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-util" (OuterVolumeSpecName: "util") pod "28a864cc-db2e-4316-ad0f-1cd948ba2214" (UID: "28a864cc-db2e-4316-ad0f-1cd948ba2214"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:15:47.964068 master-0 kubenswrapper[17644]: I0319 12:15:47.963953 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdfw9\" (UniqueName: \"kubernetes.io/projected/28a864cc-db2e-4316-ad0f-1cd948ba2214-kube-api-access-rdfw9\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:47.964068 master-0 kubenswrapper[17644]: I0319 12:15:47.964056 17644 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:47.964355 master-0 kubenswrapper[17644]: I0319 12:15:47.964094 17644 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28a864cc-db2e-4316-ad0f-1cd948ba2214-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:48.324145 master-0 kubenswrapper[17644]: I0319 12:15:48.323987 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" event={"ID":"28a864cc-db2e-4316-ad0f-1cd948ba2214","Type":"ContainerDied","Data":"415a60c5e6474c455bb1428308858267da49bb684dfc87b8dabb4823b2fe811b"} Mar 19 12:15:48.324145 master-0 kubenswrapper[17644]: I0319 12:15:48.324071 17644 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="415a60c5e6474c455bb1428308858267da49bb684dfc87b8dabb4823b2fe811b" Mar 19 12:15:48.324145 master-0 kubenswrapper[17644]: I0319 12:15:48.324071 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726pc7vr" Mar 19 12:15:55.587924 master-0 kubenswrapper[17644]: I0319 12:15:55.587865 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt"] Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588205 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerName="util" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588224 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerName="util" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588240 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45342c9f-825e-4171-a078-3b175362c618" containerName="pull" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588250 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="45342c9f-825e-4171-a078-3b175362c618" containerName="pull" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588270 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerName="pull" Mar 19 12:15:55.588509 master-0 
kubenswrapper[17644]: I0319 12:15:55.588278 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerName="pull" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588293 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerName="extract" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588301 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerName="extract" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588318 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45342c9f-825e-4171-a078-3b175362c618" containerName="util" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588326 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="45342c9f-825e-4171-a078-3b175362c618" containerName="util" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588342 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerName="pull" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588350 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerName="pull" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588361 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerName="extract" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588370 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerName="extract" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588391 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerName="pull" Mar 
19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588398 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerName="pull" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588426 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerName="util" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588433 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerName="util" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588446 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerName="extract" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588454 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerName="extract" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588466 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45342c9f-825e-4171-a078-3b175362c618" containerName="extract" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588473 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="45342c9f-825e-4171-a078-3b175362c618" containerName="extract" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: E0319 12:15:55.588489 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerName="util" Mar 19 12:15:55.588509 master-0 kubenswrapper[17644]: I0319 12:15:55.588496 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerName="util" Mar 19 12:15:55.589183 master-0 kubenswrapper[17644]: I0319 12:15:55.588635 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="45342c9f-825e-4171-a078-3b175362c618" 
containerName="extract" Mar 19 12:15:55.589183 master-0 kubenswrapper[17644]: I0319 12:15:55.588657 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="157911ad-1a9e-46fa-bf3e-bd268714ea05" containerName="extract" Mar 19 12:15:55.589183 master-0 kubenswrapper[17644]: I0319 12:15:55.588683 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a864cc-db2e-4316-ad0f-1cd948ba2214" containerName="extract" Mar 19 12:15:55.589183 master-0 kubenswrapper[17644]: I0319 12:15:55.588706 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="3874aa74-b4b4-48e8-ab72-e1b5a7f6f969" containerName="extract" Mar 19 12:15:55.591988 master-0 kubenswrapper[17644]: I0319 12:15:55.591938 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" Mar 19 12:15:55.596164 master-0 kubenswrapper[17644]: I0319 12:15:55.596124 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 19 12:15:55.596257 master-0 kubenswrapper[17644]: I0319 12:15:55.596173 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 19 12:15:55.611482 master-0 kubenswrapper[17644]: I0319 12:15:55.611432 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt"] Mar 19 12:15:55.684551 master-0 kubenswrapper[17644]: I0319 12:15:55.684503 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2734d06-38bb-4c40-a515-431d55a09d1e-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j5cbt\" (UID: \"b2734d06-38bb-4c40-a515-431d55a09d1e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" Mar 19 12:15:55.684856 master-0 
kubenswrapper[17644]: I0319 12:15:55.684838 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrdq\" (UniqueName: \"kubernetes.io/projected/b2734d06-38bb-4c40-a515-431d55a09d1e-kube-api-access-fcrdq\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j5cbt\" (UID: \"b2734d06-38bb-4c40-a515-431d55a09d1e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" Mar 19 12:15:55.786655 master-0 kubenswrapper[17644]: I0319 12:15:55.786570 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2734d06-38bb-4c40-a515-431d55a09d1e-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j5cbt\" (UID: \"b2734d06-38bb-4c40-a515-431d55a09d1e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" Mar 19 12:15:55.786884 master-0 kubenswrapper[17644]: I0319 12:15:55.786714 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcrdq\" (UniqueName: \"kubernetes.io/projected/b2734d06-38bb-4c40-a515-431d55a09d1e-kube-api-access-fcrdq\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j5cbt\" (UID: \"b2734d06-38bb-4c40-a515-431d55a09d1e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" Mar 19 12:15:55.788104 master-0 kubenswrapper[17644]: I0319 12:15:55.788077 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2734d06-38bb-4c40-a515-431d55a09d1e-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j5cbt\" (UID: \"b2734d06-38bb-4c40-a515-431d55a09d1e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" Mar 19 12:15:55.819779 master-0 kubenswrapper[17644]: I0319 12:15:55.817698 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-fcrdq\" (UniqueName: \"kubernetes.io/projected/b2734d06-38bb-4c40-a515-431d55a09d1e-kube-api-access-fcrdq\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j5cbt\" (UID: \"b2734d06-38bb-4c40-a515-431d55a09d1e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" Mar 19 12:15:55.911329 master-0 kubenswrapper[17644]: I0319 12:15:55.911222 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" Mar 19 12:15:56.352887 master-0 kubenswrapper[17644]: W0319 12:15:56.352832 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2734d06_38bb_4c40_a515_431d55a09d1e.slice/crio-3a1dc3fb63cd9ac11a7477bea415ba3f00f8dbad12f31d8a5aadb4f5ced6fdf7 WatchSource:0}: Error finding container 3a1dc3fb63cd9ac11a7477bea415ba3f00f8dbad12f31d8a5aadb4f5ced6fdf7: Status 404 returned error can't find the container with id 3a1dc3fb63cd9ac11a7477bea415ba3f00f8dbad12f31d8a5aadb4f5ced6fdf7 Mar 19 12:15:56.359996 master-0 kubenswrapper[17644]: I0319 12:15:56.359947 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt"] Mar 19 12:15:56.379819 master-0 kubenswrapper[17644]: I0319 12:15:56.379749 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" event={"ID":"b2734d06-38bb-4c40-a515-431d55a09d1e","Type":"ContainerStarted","Data":"3a1dc3fb63cd9ac11a7477bea415ba3f00f8dbad12f31d8a5aadb4f5ced6fdf7"} Mar 19 12:15:58.997434 master-0 kubenswrapper[17644]: I0319 12:15:58.997314 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-8ht49"] Mar 19 12:15:58.998620 master-0 kubenswrapper[17644]: I0319 12:15:58.998592 17644 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-8ht49" Mar 19 12:15:59.002343 master-0 kubenswrapper[17644]: I0319 12:15:59.002014 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 12:15:59.002343 master-0 kubenswrapper[17644]: I0319 12:15:59.002218 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 12:15:59.030301 master-0 kubenswrapper[17644]: I0319 12:15:59.030234 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-8ht49"] Mar 19 12:15:59.157050 master-0 kubenswrapper[17644]: I0319 12:15:59.156996 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lnh2\" (UniqueName: \"kubernetes.io/projected/8de64a53-181c-4b60-a814-c8f104593009-kube-api-access-5lnh2\") pod \"nmstate-operator-796d4cfff4-8ht49\" (UID: \"8de64a53-181c-4b60-a814-c8f104593009\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-8ht49" Mar 19 12:15:59.258571 master-0 kubenswrapper[17644]: I0319 12:15:59.258154 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lnh2\" (UniqueName: \"kubernetes.io/projected/8de64a53-181c-4b60-a814-c8f104593009-kube-api-access-5lnh2\") pod \"nmstate-operator-796d4cfff4-8ht49\" (UID: \"8de64a53-181c-4b60-a814-c8f104593009\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-8ht49" Mar 19 12:15:59.294427 master-0 kubenswrapper[17644]: I0319 12:15:59.294373 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lnh2\" (UniqueName: \"kubernetes.io/projected/8de64a53-181c-4b60-a814-c8f104593009-kube-api-access-5lnh2\") pod \"nmstate-operator-796d4cfff4-8ht49\" (UID: \"8de64a53-181c-4b60-a814-c8f104593009\") " 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-8ht49" Mar 19 12:15:59.325062 master-0 kubenswrapper[17644]: I0319 12:15:59.324994 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-8ht49" Mar 19 12:15:59.852256 master-0 kubenswrapper[17644]: I0319 12:15:59.849840 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-8ht49"] Mar 19 12:16:00.414631 master-0 kubenswrapper[17644]: I0319 12:16:00.414575 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-8ht49" event={"ID":"8de64a53-181c-4b60-a814-c8f104593009","Type":"ContainerStarted","Data":"e222a529673b0b5824f75097cbe8af9346bb8cf7bc0150874c4fdbd15e0342e6"} Mar 19 12:16:00.416214 master-0 kubenswrapper[17644]: I0319 12:16:00.416165 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" event={"ID":"b2734d06-38bb-4c40-a515-431d55a09d1e","Type":"ContainerStarted","Data":"d66e9839aebd4dec19c4e734ccd26adc159c123cede10c840bdd1fdcc609109c"} Mar 19 12:16:00.440318 master-0 kubenswrapper[17644]: I0319 12:16:00.440238 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j5cbt" podStartSLOduration=2.417107771 podStartE2EDuration="5.440217855s" podCreationTimestamp="2026-03-19 12:15:55 +0000 UTC" firstStartedPulling="2026-03-19 12:15:56.355666312 +0000 UTC m=+990.125624347" lastFinishedPulling="2026-03-19 12:15:59.378776396 +0000 UTC m=+993.148734431" observedRunningTime="2026-03-19 12:16:00.437884948 +0000 UTC m=+994.207843003" watchObservedRunningTime="2026-03-19 12:16:00.440217855 +0000 UTC m=+994.210175910" Mar 19 12:16:01.276448 master-0 kubenswrapper[17644]: I0319 12:16:01.276392 17644 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf"] Mar 19 12:16:01.277647 master-0 kubenswrapper[17644]: I0319 12:16:01.277626 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.281995 master-0 kubenswrapper[17644]: I0319 12:16:01.281681 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 12:16:01.282128 master-0 kubenswrapper[17644]: I0319 12:16:01.282038 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 12:16:01.283898 master-0 kubenswrapper[17644]: I0319 12:16:01.283839 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 12:16:01.284830 master-0 kubenswrapper[17644]: I0319 12:16:01.284806 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 12:16:01.296600 master-0 kubenswrapper[17644]: I0319 12:16:01.296559 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf"] Mar 19 12:16:01.395206 master-0 kubenswrapper[17644]: I0319 12:16:01.394674 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7bed1df9-51fd-4f70-95e9-4ec7333995d1-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: \"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.395206 master-0 kubenswrapper[17644]: I0319 12:16:01.394782 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnbp6\" (UniqueName: 
\"kubernetes.io/projected/7bed1df9-51fd-4f70-95e9-4ec7333995d1-kube-api-access-pnbp6\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: \"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.395206 master-0 kubenswrapper[17644]: I0319 12:16:01.394830 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7bed1df9-51fd-4f70-95e9-4ec7333995d1-webhook-cert\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: \"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.497002 master-0 kubenswrapper[17644]: I0319 12:16:01.496925 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7bed1df9-51fd-4f70-95e9-4ec7333995d1-webhook-cert\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: \"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.497559 master-0 kubenswrapper[17644]: I0319 12:16:01.497093 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7bed1df9-51fd-4f70-95e9-4ec7333995d1-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: \"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.497668 master-0 kubenswrapper[17644]: I0319 12:16:01.497629 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnbp6\" (UniqueName: \"kubernetes.io/projected/7bed1df9-51fd-4f70-95e9-4ec7333995d1-kube-api-access-pnbp6\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: 
\"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.502510 master-0 kubenswrapper[17644]: I0319 12:16:01.502393 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7bed1df9-51fd-4f70-95e9-4ec7333995d1-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: \"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.508436 master-0 kubenswrapper[17644]: I0319 12:16:01.508311 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7bed1df9-51fd-4f70-95e9-4ec7333995d1-webhook-cert\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: \"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.650023 master-0 kubenswrapper[17644]: I0319 12:16:01.649966 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnbp6\" (UniqueName: \"kubernetes.io/projected/7bed1df9-51fd-4f70-95e9-4ec7333995d1-kube-api-access-pnbp6\") pod \"metallb-operator-controller-manager-7c8d7d7bcf-bpkwf\" (UID: \"7bed1df9-51fd-4f70-95e9-4ec7333995d1\") " pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.907702 master-0 kubenswrapper[17644]: I0319 12:16:01.906546 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:01.996451 master-0 kubenswrapper[17644]: I0319 12:16:01.995193 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl"] Mar 19 12:16:01.996451 master-0 kubenswrapper[17644]: I0319 12:16:01.996068 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.001535 master-0 kubenswrapper[17644]: I0319 12:16:02.000297 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 12:16:02.001535 master-0 kubenswrapper[17644]: I0319 12:16:02.000495 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 12:16:02.018815 master-0 kubenswrapper[17644]: I0319 12:16:02.017293 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl"] Mar 19 12:16:02.116303 master-0 kubenswrapper[17644]: I0319 12:16:02.111760 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbkc\" (UniqueName: \"kubernetes.io/projected/e87720f9-bdd2-4397-808c-b51869af7cfe-kube-api-access-dwbkc\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.116303 master-0 kubenswrapper[17644]: I0319 12:16:02.112174 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e87720f9-bdd2-4397-808c-b51869af7cfe-webhook-cert\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " 
pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.116303 master-0 kubenswrapper[17644]: I0319 12:16:02.112321 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e87720f9-bdd2-4397-808c-b51869af7cfe-apiservice-cert\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.215769 master-0 kubenswrapper[17644]: I0319 12:16:02.213917 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e87720f9-bdd2-4397-808c-b51869af7cfe-webhook-cert\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.215769 master-0 kubenswrapper[17644]: I0319 12:16:02.214022 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e87720f9-bdd2-4397-808c-b51869af7cfe-apiservice-cert\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.215769 master-0 kubenswrapper[17644]: I0319 12:16:02.214162 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbkc\" (UniqueName: \"kubernetes.io/projected/e87720f9-bdd2-4397-808c-b51869af7cfe-kube-api-access-dwbkc\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.221758 master-0 kubenswrapper[17644]: I0319 12:16:02.220852 17644 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e87720f9-bdd2-4397-808c-b51869af7cfe-webhook-cert\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.221758 master-0 kubenswrapper[17644]: I0319 12:16:02.221640 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e87720f9-bdd2-4397-808c-b51869af7cfe-apiservice-cert\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.432842 master-0 kubenswrapper[17644]: I0319 12:16:02.429255 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbkc\" (UniqueName: \"kubernetes.io/projected/e87720f9-bdd2-4397-808c-b51869af7cfe-kube-api-access-dwbkc\") pod \"metallb-operator-webhook-server-5b47fbc9b4-zf5jl\" (UID: \"e87720f9-bdd2-4397-808c-b51869af7cfe\") " pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:02.616530 master-0 kubenswrapper[17644]: I0319 12:16:02.616480 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf"] Mar 19 12:16:02.653722 master-0 kubenswrapper[17644]: I0319 12:16:02.653544 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:03.386942 master-0 kubenswrapper[17644]: I0319 12:16:03.386376 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl"] Mar 19 12:16:03.417202 master-0 kubenswrapper[17644]: I0319 12:16:03.415885 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qgj2h"] Mar 19 12:16:03.417202 master-0 kubenswrapper[17644]: I0319 12:16:03.416760 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h" Mar 19 12:16:03.424754 master-0 kubenswrapper[17644]: I0319 12:16:03.424297 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 12:16:03.424754 master-0 kubenswrapper[17644]: I0319 12:16:03.424339 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 12:16:03.444763 master-0 kubenswrapper[17644]: I0319 12:16:03.442658 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qgj2h"] Mar 19 12:16:03.487814 master-0 kubenswrapper[17644]: I0319 12:16:03.484789 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" event={"ID":"7bed1df9-51fd-4f70-95e9-4ec7333995d1","Type":"ContainerStarted","Data":"c1936a6c1644eddb9fb16d24300981a7915e6081b7bc470b8df4ccfe00efd1c7"} Mar 19 12:16:03.525747 master-0 kubenswrapper[17644]: I0319 12:16:03.520103 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" event={"ID":"e87720f9-bdd2-4397-808c-b51869af7cfe","Type":"ContainerStarted","Data":"eb434298c676a8ae2f996d3e893dec9080ba73be2efcf0c7e47577687760a6b8"} Mar 19 12:16:03.567142 master-0 
kubenswrapper[17644]: I0319 12:16:03.566688 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chrzd\" (UniqueName: \"kubernetes.io/projected/d98f87fe-cea4-41a5-9000-743954979694-kube-api-access-chrzd\") pod \"cert-manager-webhook-6888856db4-qgj2h\" (UID: \"d98f87fe-cea4-41a5-9000-743954979694\") " pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h"
Mar 19 12:16:03.567142 master-0 kubenswrapper[17644]: I0319 12:16:03.566795 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d98f87fe-cea4-41a5-9000-743954979694-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qgj2h\" (UID: \"d98f87fe-cea4-41a5-9000-743954979694\") " pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h"
Mar 19 12:16:03.671796 master-0 kubenswrapper[17644]: I0319 12:16:03.668351 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d98f87fe-cea4-41a5-9000-743954979694-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qgj2h\" (UID: \"d98f87fe-cea4-41a5-9000-743954979694\") " pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h"
Mar 19 12:16:03.671796 master-0 kubenswrapper[17644]: I0319 12:16:03.668502 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chrzd\" (UniqueName: \"kubernetes.io/projected/d98f87fe-cea4-41a5-9000-743954979694-kube-api-access-chrzd\") pod \"cert-manager-webhook-6888856db4-qgj2h\" (UID: \"d98f87fe-cea4-41a5-9000-743954979694\") " pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h"
Mar 19 12:16:03.703503 master-0 kubenswrapper[17644]: I0319 12:16:03.703044 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chrzd\" (UniqueName:
\"kubernetes.io/projected/d98f87fe-cea4-41a5-9000-743954979694-kube-api-access-chrzd\") pod \"cert-manager-webhook-6888856db4-qgj2h\" (UID: \"d98f87fe-cea4-41a5-9000-743954979694\") " pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h"
Mar 19 12:16:03.704566 master-0 kubenswrapper[17644]: I0319 12:16:03.704234 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d98f87fe-cea4-41a5-9000-743954979694-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qgj2h\" (UID: \"d98f87fe-cea4-41a5-9000-743954979694\") " pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h"
Mar 19 12:16:03.780049 master-0 kubenswrapper[17644]: I0319 12:16:03.771926 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h"
Mar 19 12:16:04.621747 master-0 kubenswrapper[17644]: I0319 12:16:04.618435 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qgj2h"]
Mar 19 12:16:05.368800 master-0 kubenswrapper[17644]: I0319 12:16:05.368191 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rr2cw"]
Mar 19 12:16:05.369531 master-0 kubenswrapper[17644]: I0319 12:16:05.369216 17644 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw"
Mar 19 12:16:05.395601 master-0 kubenswrapper[17644]: I0319 12:16:05.394900 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rr2cw"]
Mar 19 12:16:05.404093 master-0 kubenswrapper[17644]: I0319 12:16:05.404036 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3fa634e-d357-45b8-b0bd-8ab6b961de7b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rr2cw\" (UID: \"a3fa634e-d357-45b8-b0bd-8ab6b961de7b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw"
Mar 19 12:16:05.404303 master-0 kubenswrapper[17644]: I0319 12:16:05.404155 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwc6\" (UniqueName: \"kubernetes.io/projected/a3fa634e-d357-45b8-b0bd-8ab6b961de7b-kube-api-access-6nwc6\") pod \"cert-manager-cainjector-5545bd876-rr2cw\" (UID: \"a3fa634e-d357-45b8-b0bd-8ab6b961de7b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw"
Mar 19 12:16:05.506755 master-0 kubenswrapper[17644]: I0319 12:16:05.505388 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwc6\" (UniqueName: \"kubernetes.io/projected/a3fa634e-d357-45b8-b0bd-8ab6b961de7b-kube-api-access-6nwc6\") pod \"cert-manager-cainjector-5545bd876-rr2cw\" (UID: \"a3fa634e-d357-45b8-b0bd-8ab6b961de7b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw"
Mar 19 12:16:05.506755 master-0 kubenswrapper[17644]: I0319 12:16:05.505495 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3fa634e-d357-45b8-b0bd-8ab6b961de7b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rr2cw\" (UID: \"a3fa634e-d357-45b8-b0bd-8ab6b961de7b\") "
pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw"
Mar 19 12:16:05.545756 master-0 kubenswrapper[17644]: I0319 12:16:05.543825 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3fa634e-d357-45b8-b0bd-8ab6b961de7b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rr2cw\" (UID: \"a3fa634e-d357-45b8-b0bd-8ab6b961de7b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw"
Mar 19 12:16:05.558807 master-0 kubenswrapper[17644]: I0319 12:16:05.558753 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwc6\" (UniqueName: \"kubernetes.io/projected/a3fa634e-d357-45b8-b0bd-8ab6b961de7b-kube-api-access-6nwc6\") pod \"cert-manager-cainjector-5545bd876-rr2cw\" (UID: \"a3fa634e-d357-45b8-b0bd-8ab6b961de7b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw"
Mar 19 12:16:05.687762 master-0 kubenswrapper[17644]: W0319 12:16:05.683867 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd98f87fe_cea4_41a5_9000_743954979694.slice/crio-5195fbbc6fd741717ec5c18b8ebb5d8d1e635a948325363d965bdb72b3ce90e0 WatchSource:0}: Error finding container 5195fbbc6fd741717ec5c18b8ebb5d8d1e635a948325363d965bdb72b3ce90e0: Status 404 returned error can't find the container with id 5195fbbc6fd741717ec5c18b8ebb5d8d1e635a948325363d965bdb72b3ce90e0
Mar 19 12:16:05.694768 master-0 kubenswrapper[17644]: I0319 12:16:05.692931 17644 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw"
Mar 19 12:16:06.141927 master-0 kubenswrapper[17644]: W0319 12:16:06.141819 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3fa634e_d357_45b8_b0bd_8ab6b961de7b.slice/crio-3e1ad39f8a2fbd604d6d453c94c58892403098ecd9baca2fc82674a5656675a5 WatchSource:0}: Error finding container 3e1ad39f8a2fbd604d6d453c94c58892403098ecd9baca2fc82674a5656675a5: Status 404 returned error can't find the container with id 3e1ad39f8a2fbd604d6d453c94c58892403098ecd9baca2fc82674a5656675a5
Mar 19 12:16:06.150970 master-0 kubenswrapper[17644]: I0319 12:16:06.150927 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rr2cw"]
Mar 19 12:16:06.579019 master-0 kubenswrapper[17644]: I0319 12:16:06.578958 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h" event={"ID":"d98f87fe-cea4-41a5-9000-743954979694","Type":"ContainerStarted","Data":"5195fbbc6fd741717ec5c18b8ebb5d8d1e635a948325363d965bdb72b3ce90e0"}
Mar 19 12:16:06.608056 master-0 kubenswrapper[17644]: I0319 12:16:06.607949 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw" event={"ID":"a3fa634e-d357-45b8-b0bd-8ab6b961de7b","Type":"ContainerStarted","Data":"3e1ad39f8a2fbd604d6d453c94c58892403098ecd9baca2fc82674a5656675a5"}
Mar 19 12:16:06.629050 master-0 kubenswrapper[17644]: I0319 12:16:06.623820 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-8ht49" event={"ID":"8de64a53-181c-4b60-a814-c8f104593009","Type":"ContainerStarted","Data":"701cd374950a11a8f9c4bdd3490b658af11f12d593c6582f444bc87413d990a3"}
Mar 19 12:16:06.724288 master-0 kubenswrapper[17644]: I0319 12:16:06.723203 17644 pod_startup_latency_tracker.go:104] "Observed pod startup
duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-8ht49" podStartSLOduration=2.783543467 podStartE2EDuration="8.723149668s" podCreationTimestamp="2026-03-19 12:15:58 +0000 UTC" firstStartedPulling="2026-03-19 12:15:59.858299493 +0000 UTC m=+993.628257528" lastFinishedPulling="2026-03-19 12:16:05.797905694 +0000 UTC m=+999.567863729" observedRunningTime="2026-03-19 12:16:06.71623899 +0000 UTC m=+1000.486197035" watchObservedRunningTime="2026-03-19 12:16:06.723149668 +0000 UTC m=+1000.493107703"
Mar 19 12:16:14.264501 master-0 kubenswrapper[17644]: I0319 12:16:14.264431 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-msc5g"]
Mar 19 12:16:14.270138 master-0 kubenswrapper[17644]: I0319 12:16:14.265413 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-msc5g"
Mar 19 12:16:14.270138 master-0 kubenswrapper[17644]: I0319 12:16:14.270019 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 19 12:16:14.270590 master-0 kubenswrapper[17644]: I0319 12:16:14.270512 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 19 12:16:14.309770 master-0 kubenswrapper[17644]: I0319 12:16:14.300486 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-msc5g"]
Mar 19 12:16:14.345700 master-0 kubenswrapper[17644]: I0319 12:16:14.343815 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwntz\" (UniqueName: \"kubernetes.io/projected/0edff6cf-09c0-4eba-81fa-a4e78b150269-kube-api-access-qwntz\") pod \"obo-prometheus-operator-8ff7d675-msc5g\" (UID: \"0edff6cf-09c0-4eba-81fa-a4e78b150269\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-msc5g"
Mar 19 12:16:14.448512
master-0 kubenswrapper[17644]: I0319 12:16:14.448092 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwntz\" (UniqueName: \"kubernetes.io/projected/0edff6cf-09c0-4eba-81fa-a4e78b150269-kube-api-access-qwntz\") pod \"obo-prometheus-operator-8ff7d675-msc5g\" (UID: \"0edff6cf-09c0-4eba-81fa-a4e78b150269\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-msc5g"
Mar 19 12:16:14.486535 master-0 kubenswrapper[17644]: I0319 12:16:14.486486 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwntz\" (UniqueName: \"kubernetes.io/projected/0edff6cf-09c0-4eba-81fa-a4e78b150269-kube-api-access-qwntz\") pod \"obo-prometheus-operator-8ff7d675-msc5g\" (UID: \"0edff6cf-09c0-4eba-81fa-a4e78b150269\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-msc5g"
Mar 19 12:16:14.716753 master-0 kubenswrapper[17644]: I0319 12:16:14.711298 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-msc5g"
Mar 19 12:16:14.821375 master-0 kubenswrapper[17644]: I0319 12:16:14.821290 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"]
Mar 19 12:16:14.822972 master-0 kubenswrapper[17644]: I0319 12:16:14.822938 17644 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"
Mar 19 12:16:14.826973 master-0 kubenswrapper[17644]: I0319 12:16:14.826934 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 19 12:16:14.840582 master-0 kubenswrapper[17644]: I0319 12:16:14.840504 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"]
Mar 19 12:16:14.842128 master-0 kubenswrapper[17644]: I0319 12:16:14.842100 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"
Mar 19 12:16:14.849929 master-0 kubenswrapper[17644]: I0319 12:16:14.846930 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"]
Mar 19 12:16:14.886662 master-0 kubenswrapper[17644]: I0319 12:16:14.880518 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"]
Mar 19 12:16:14.908751 master-0 kubenswrapper[17644]: I0319 12:16:14.907559 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-84qmh"]
Mar 19 12:16:14.908751 master-0 kubenswrapper[17644]: I0319 12:16:14.908651 17644 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-84qmh"
Mar 19 12:16:14.938871 master-0 kubenswrapper[17644]: I0319 12:16:14.936794 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-84qmh"]
Mar 19 12:16:14.955759 master-0 kubenswrapper[17644]: I0319 12:16:14.955651 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58c25e67-f473-47f4-b461-e25d7761c102-bound-sa-token\") pod \"cert-manager-545d4d4674-84qmh\" (UID: \"58c25e67-f473-47f4-b461-e25d7761c102\") " pod="cert-manager/cert-manager-545d4d4674-84qmh"
Mar 19 12:16:14.956120 master-0 kubenswrapper[17644]: I0319 12:16:14.955799 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6229bbeb-842d-4465-962e-f8148d05cf6f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d\" (UID: \"6229bbeb-842d-4465-962e-f8148d05cf6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"
Mar 19 12:16:14.956120 master-0 kubenswrapper[17644]: I0319 12:16:14.955844 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/56241f4e-fa18-46a7-9be2-5b6d54cd4e26-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf\" (UID: \"56241f4e-fa18-46a7-9be2-5b6d54cd4e26\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"
Mar 19 12:16:14.956120 master-0 kubenswrapper[17644]: I0319 12:16:14.955876 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6229bbeb-842d-4465-962e-f8148d05cf6f-apiservice-cert\") pod
\"obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d\" (UID: \"6229bbeb-842d-4465-962e-f8148d05cf6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"
Mar 19 12:16:14.956120 master-0 kubenswrapper[17644]: I0319 12:16:14.955904 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfsq2\" (UniqueName: \"kubernetes.io/projected/58c25e67-f473-47f4-b461-e25d7761c102-kube-api-access-sfsq2\") pod \"cert-manager-545d4d4674-84qmh\" (UID: \"58c25e67-f473-47f4-b461-e25d7761c102\") " pod="cert-manager/cert-manager-545d4d4674-84qmh"
Mar 19 12:16:14.956120 master-0 kubenswrapper[17644]: I0319 12:16:14.955926 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/56241f4e-fa18-46a7-9be2-5b6d54cd4e26-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf\" (UID: \"56241f4e-fa18-46a7-9be2-5b6d54cd4e26\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"
Mar 19 12:16:15.057149 master-0 kubenswrapper[17644]: I0319 12:16:15.057034 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58c25e67-f473-47f4-b461-e25d7761c102-bound-sa-token\") pod \"cert-manager-545d4d4674-84qmh\" (UID: \"58c25e67-f473-47f4-b461-e25d7761c102\") " pod="cert-manager/cert-manager-545d4d4674-84qmh"
Mar 19 12:16:15.057149 master-0 kubenswrapper[17644]: I0319 12:16:15.057125 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6229bbeb-842d-4465-962e-f8148d05cf6f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d\" (UID: \"6229bbeb-842d-4465-962e-f8148d05cf6f\") "
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"
Mar 19 12:16:15.057364 master-0 kubenswrapper[17644]: I0319 12:16:15.057164 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/56241f4e-fa18-46a7-9be2-5b6d54cd4e26-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf\" (UID: \"56241f4e-fa18-46a7-9be2-5b6d54cd4e26\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"
Mar 19 12:16:15.057364 master-0 kubenswrapper[17644]: I0319 12:16:15.057190 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6229bbeb-842d-4465-962e-f8148d05cf6f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d\" (UID: \"6229bbeb-842d-4465-962e-f8148d05cf6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"
Mar 19 12:16:15.057364 master-0 kubenswrapper[17644]: I0319 12:16:15.057219 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfsq2\" (UniqueName: \"kubernetes.io/projected/58c25e67-f473-47f4-b461-e25d7761c102-kube-api-access-sfsq2\") pod \"cert-manager-545d4d4674-84qmh\" (UID: \"58c25e67-f473-47f4-b461-e25d7761c102\") " pod="cert-manager/cert-manager-545d4d4674-84qmh"
Mar 19 12:16:15.057364 master-0 kubenswrapper[17644]: I0319 12:16:15.057243 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/56241f4e-fa18-46a7-9be2-5b6d54cd4e26-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf\" (UID: \"56241f4e-fa18-46a7-9be2-5b6d54cd4e26\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"
Mar 19 12:16:15.060534 master-0 kubenswrapper[17644]: I0319 12:16:15.060499 17644
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/56241f4e-fa18-46a7-9be2-5b6d54cd4e26-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf\" (UID: \"56241f4e-fa18-46a7-9be2-5b6d54cd4e26\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"
Mar 19 12:16:15.070248 master-0 kubenswrapper[17644]: I0319 12:16:15.067273 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6229bbeb-842d-4465-962e-f8148d05cf6f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d\" (UID: \"6229bbeb-842d-4465-962e-f8148d05cf6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"
Mar 19 12:16:15.070248 master-0 kubenswrapper[17644]: I0319 12:16:15.069745 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/56241f4e-fa18-46a7-9be2-5b6d54cd4e26-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf\" (UID: \"56241f4e-fa18-46a7-9be2-5b6d54cd4e26\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"
Mar 19 12:16:15.070488 master-0 kubenswrapper[17644]: I0319 12:16:15.070284 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6229bbeb-842d-4465-962e-f8148d05cf6f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d\" (UID: \"6229bbeb-842d-4465-962e-f8148d05cf6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"
Mar 19 12:16:15.088229 master-0 kubenswrapper[17644]: I0319 12:16:15.087579 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfsq2\" (UniqueName:
\"kubernetes.io/projected/58c25e67-f473-47f4-b461-e25d7761c102-kube-api-access-sfsq2\") pod \"cert-manager-545d4d4674-84qmh\" (UID: \"58c25e67-f473-47f4-b461-e25d7761c102\") " pod="cert-manager/cert-manager-545d4d4674-84qmh"
Mar 19 12:16:15.088229 master-0 kubenswrapper[17644]: I0319 12:16:15.088184 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58c25e67-f473-47f4-b461-e25d7761c102-bound-sa-token\") pod \"cert-manager-545d4d4674-84qmh\" (UID: \"58c25e67-f473-47f4-b461-e25d7761c102\") " pod="cert-manager/cert-manager-545d4d4674-84qmh"
Mar 19 12:16:15.201815 master-0 kubenswrapper[17644]: I0319 12:16:15.196202 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"
Mar 19 12:16:15.226785 master-0 kubenswrapper[17644]: I0319 12:16:15.225397 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-dwzhl"]
Mar 19 12:16:15.227011 master-0 kubenswrapper[17644]: I0319 12:16:15.226938 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl"
Mar 19 12:16:15.231225 master-0 kubenswrapper[17644]: I0319 12:16:15.231120 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 19 12:16:15.243415 master-0 kubenswrapper[17644]: I0319 12:16:15.243350 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"
Mar 19 12:16:15.255437 master-0 kubenswrapper[17644]: I0319 12:16:15.254513 17644 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-84qmh"
Mar 19 12:16:15.257689 master-0 kubenswrapper[17644]: I0319 12:16:15.257448 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-dwzhl"]
Mar 19 12:16:15.267658 master-0 kubenswrapper[17644]: I0319 12:16:15.267587 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpzrr\" (UniqueName: \"kubernetes.io/projected/7906718f-0803-4c47-a275-e2e02feb34c3-kube-api-access-gpzrr\") pod \"observability-operator-6dd7dd855f-dwzhl\" (UID: \"7906718f-0803-4c47-a275-e2e02feb34c3\") " pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl"
Mar 19 12:16:15.268377 master-0 kubenswrapper[17644]: I0319 12:16:15.267679 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7906718f-0803-4c47-a275-e2e02feb34c3-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-dwzhl\" (UID: \"7906718f-0803-4c47-a275-e2e02feb34c3\") " pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl"
Mar 19 12:16:15.382986 master-0 kubenswrapper[17644]: I0319 12:16:15.382826 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpzrr\" (UniqueName: \"kubernetes.io/projected/7906718f-0803-4c47-a275-e2e02feb34c3-kube-api-access-gpzrr\") pod \"observability-operator-6dd7dd855f-dwzhl\" (UID: \"7906718f-0803-4c47-a275-e2e02feb34c3\") " pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl"
Mar 19 12:16:15.383319 master-0 kubenswrapper[17644]: I0319 12:16:15.383272 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7906718f-0803-4c47-a275-e2e02feb34c3-observability-operator-tls\") pod
\"observability-operator-6dd7dd855f-dwzhl\" (UID: \"7906718f-0803-4c47-a275-e2e02feb34c3\") " pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl"
Mar 19 12:16:15.386643 master-0 kubenswrapper[17644]: I0319 12:16:15.386606 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7906718f-0803-4c47-a275-e2e02feb34c3-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-dwzhl\" (UID: \"7906718f-0803-4c47-a275-e2e02feb34c3\") " pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl"
Mar 19 12:16:15.422778 master-0 kubenswrapper[17644]: I0319 12:16:15.420071 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpzrr\" (UniqueName: \"kubernetes.io/projected/7906718f-0803-4c47-a275-e2e02feb34c3-kube-api-access-gpzrr\") pod \"observability-operator-6dd7dd855f-dwzhl\" (UID: \"7906718f-0803-4c47-a275-e2e02feb34c3\") " pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl"
Mar 19 12:16:15.580760 master-0 kubenswrapper[17644]: I0319 12:16:15.580018 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl"
Mar 19 12:16:15.692029 master-0 kubenswrapper[17644]: I0319 12:16:15.691964 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5554d8fd8f-jkhd6"]
Mar 19 12:16:15.703237 master-0 kubenswrapper[17644]: I0319 12:16:15.703138 17644 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.711756 master-0 kubenswrapper[17644]: I0319 12:16:15.707010 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert"
Mar 19 12:16:15.769019 master-0 kubenswrapper[17644]: I0319 12:16:15.749509 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5554d8fd8f-jkhd6"]
Mar 19 12:16:15.791832 master-0 kubenswrapper[17644]: I0319 12:16:15.791784 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55660b4d-8a68-4062-bc7a-1216d9be2aa3-apiservice-cert\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.792059 master-0 kubenswrapper[17644]: I0319 12:16:15.791902 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55660b4d-8a68-4062-bc7a-1216d9be2aa3-openshift-service-ca\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.792059 master-0 kubenswrapper[17644]: I0319 12:16:15.791929 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fxxn\" (UniqueName: \"kubernetes.io/projected/55660b4d-8a68-4062-bc7a-1216d9be2aa3-kube-api-access-2fxxn\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.792059 master-0 kubenswrapper[17644]: I0319 12:16:15.791945 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55660b4d-8a68-4062-bc7a-1216d9be2aa3-webhook-cert\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.896670 master-0 kubenswrapper[17644]: I0319 12:16:15.896616 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55660b4d-8a68-4062-bc7a-1216d9be2aa3-openshift-service-ca\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.896989 master-0 kubenswrapper[17644]: I0319 12:16:15.896680 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fxxn\" (UniqueName: \"kubernetes.io/projected/55660b4d-8a68-4062-bc7a-1216d9be2aa3-kube-api-access-2fxxn\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.896989 master-0 kubenswrapper[17644]: I0319 12:16:15.896699 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55660b4d-8a68-4062-bc7a-1216d9be2aa3-webhook-cert\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.896989 master-0 kubenswrapper[17644]: I0319 12:16:15.896739 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55660b4d-8a68-4062-bc7a-1216d9be2aa3-apiservice-cert\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.898347 master-0
kubenswrapper[17644]: I0319 12:16:15.898328 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55660b4d-8a68-4062-bc7a-1216d9be2aa3-openshift-service-ca\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.900609 master-0 kubenswrapper[17644]: I0319 12:16:15.900570 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55660b4d-8a68-4062-bc7a-1216d9be2aa3-apiservice-cert\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.911484 master-0 kubenswrapper[17644]: I0319 12:16:15.911456 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55660b4d-8a68-4062-bc7a-1216d9be2aa3-webhook-cert\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:15.928348 master-0 kubenswrapper[17644]: I0319 12:16:15.928301 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fxxn\" (UniqueName: \"kubernetes.io/projected/55660b4d-8a68-4062-bc7a-1216d9be2aa3-kube-api-access-2fxxn\") pod \"perses-operator-5554d8fd8f-jkhd6\" (UID: \"55660b4d-8a68-4062-bc7a-1216d9be2aa3\") " pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6"
Mar 19 12:16:16.091648 master-0 kubenswrapper[17644]: I0319 12:16:16.091519 17644 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6" Mar 19 12:16:17.194107 master-0 kubenswrapper[17644]: I0319 12:16:17.193814 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d"] Mar 19 12:16:17.271400 master-0 kubenswrapper[17644]: W0319 12:16:17.271224 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0edff6cf_09c0_4eba_81fa_a4e78b150269.slice/crio-8b216819fa2e937891f63ddf7a4f049eb84ab8d29b1d78daedca141dcc368f06 WatchSource:0}: Error finding container 8b216819fa2e937891f63ddf7a4f049eb84ab8d29b1d78daedca141dcc368f06: Status 404 returned error can't find the container with id 8b216819fa2e937891f63ddf7a4f049eb84ab8d29b1d78daedca141dcc368f06 Mar 19 12:16:17.272463 master-0 kubenswrapper[17644]: I0319 12:16:17.272410 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-msc5g"] Mar 19 12:16:17.279122 master-0 kubenswrapper[17644]: I0319 12:16:17.279049 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-dwzhl"] Mar 19 12:16:17.469023 master-0 kubenswrapper[17644]: I0319 12:16:17.466171 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf"] Mar 19 12:16:17.471608 master-0 kubenswrapper[17644]: I0319 12:16:17.471223 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-84qmh"] Mar 19 12:16:17.482182 master-0 kubenswrapper[17644]: I0319 12:16:17.482134 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5554d8fd8f-jkhd6"] Mar 19 12:16:17.754683 master-0 kubenswrapper[17644]: I0319 12:16:17.754550 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" event={"ID":"e87720f9-bdd2-4397-808c-b51869af7cfe","Type":"ContainerStarted","Data":"691201d2cc00fbe701496e00a01bbabb466b9df4c75af11d08f1300b32afc197"} Mar 19 12:16:17.754899 master-0 kubenswrapper[17644]: I0319 12:16:17.754673 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:17.755841 master-0 kubenswrapper[17644]: I0319 12:16:17.755797 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6" event={"ID":"55660b4d-8a68-4062-bc7a-1216d9be2aa3","Type":"ContainerStarted","Data":"78a9f365a6bd1588bbe72094a75c05e47446eb1422839f174db3b60d73d1cbf2"} Mar 19 12:16:17.758642 master-0 kubenswrapper[17644]: I0319 12:16:17.758609 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h" event={"ID":"d98f87fe-cea4-41a5-9000-743954979694","Type":"ContainerStarted","Data":"2cc521585af3b74dad9777b5aa99754351965983436bb76b4b41f3253e3f5d35"} Mar 19 12:16:17.758824 master-0 kubenswrapper[17644]: I0319 12:16:17.758787 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h" Mar 19 12:16:17.761865 master-0 kubenswrapper[17644]: I0319 12:16:17.759972 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d" event={"ID":"6229bbeb-842d-4465-962e-f8148d05cf6f","Type":"ContainerStarted","Data":"d3727a4793ba4b4eeb890543b7184d5572235bcfc84edb82b277f2ac76f532e7"} Mar 19 12:16:17.761865 master-0 kubenswrapper[17644]: I0319 12:16:17.760866 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl" 
event={"ID":"7906718f-0803-4c47-a275-e2e02feb34c3","Type":"ContainerStarted","Data":"0a04b6bfd69717a20ccb697464b66f555986e2abbc93a4af5163fcf96d83db1c"} Mar 19 12:16:17.762355 master-0 kubenswrapper[17644]: I0319 12:16:17.762322 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw" event={"ID":"a3fa634e-d357-45b8-b0bd-8ab6b961de7b","Type":"ContainerStarted","Data":"b6fc82d9caf83cd985e45c4a090500c7ba067abf0e1337191d3bce8257885b93"} Mar 19 12:16:17.765100 master-0 kubenswrapper[17644]: I0319 12:16:17.765059 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf" event={"ID":"56241f4e-fa18-46a7-9be2-5b6d54cd4e26","Type":"ContainerStarted","Data":"0e006251d67d26b3ec2121b147b5c8a53f11ce203e0a88fe622d957d98b8907a"} Mar 19 12:16:17.767069 master-0 kubenswrapper[17644]: I0319 12:16:17.767031 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-84qmh" event={"ID":"58c25e67-f473-47f4-b461-e25d7761c102","Type":"ContainerStarted","Data":"52f72dc4e383c62c7a5919a59d1f885cf95c87c8119a241f1fa35203cb533acd"} Mar 19 12:16:17.767130 master-0 kubenswrapper[17644]: I0319 12:16:17.767070 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-84qmh" event={"ID":"58c25e67-f473-47f4-b461-e25d7761c102","Type":"ContainerStarted","Data":"4357358436fd71c53551fdfbf2abeb84538478c2107bf692e84137cb9f5983f0"} Mar 19 12:16:17.768544 master-0 kubenswrapper[17644]: I0319 12:16:17.768504 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" event={"ID":"7bed1df9-51fd-4f70-95e9-4ec7333995d1","Type":"ContainerStarted","Data":"3495a3a76516bf3596493b8bfb4aea621d1044110f986f1c0129e74cbfab04bb"} Mar 19 12:16:17.768623 master-0 kubenswrapper[17644]: I0319 12:16:17.768564 17644 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:16:17.769552 master-0 kubenswrapper[17644]: I0319 12:16:17.769520 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-msc5g" event={"ID":"0edff6cf-09c0-4eba-81fa-a4e78b150269","Type":"ContainerStarted","Data":"8b216819fa2e937891f63ddf7a4f049eb84ab8d29b1d78daedca141dcc368f06"} Mar 19 12:16:17.782578 master-0 kubenswrapper[17644]: I0319 12:16:17.782320 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" podStartSLOduration=3.65793058 podStartE2EDuration="16.782290809s" podCreationTimestamp="2026-03-19 12:16:01 +0000 UTC" firstStartedPulling="2026-03-19 12:16:03.394190432 +0000 UTC m=+997.164148467" lastFinishedPulling="2026-03-19 12:16:16.518550651 +0000 UTC m=+1010.288508696" observedRunningTime="2026-03-19 12:16:17.772638984 +0000 UTC m=+1011.542597039" watchObservedRunningTime="2026-03-19 12:16:17.782290809 +0000 UTC m=+1011.552248844" Mar 19 12:16:17.810258 master-0 kubenswrapper[17644]: I0319 12:16:17.808906 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" podStartSLOduration=2.921742063 podStartE2EDuration="16.808874204s" podCreationTimestamp="2026-03-19 12:16:01 +0000 UTC" firstStartedPulling="2026-03-19 12:16:02.625912907 +0000 UTC m=+996.395870942" lastFinishedPulling="2026-03-19 12:16:16.513045058 +0000 UTC m=+1010.283003083" observedRunningTime="2026-03-19 12:16:17.800260015 +0000 UTC m=+1011.570218060" watchObservedRunningTime="2026-03-19 12:16:17.808874204 +0000 UTC m=+1011.578832249" Mar 19 12:16:17.871135 master-0 kubenswrapper[17644]: I0319 12:16:17.869710 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h" 
podStartSLOduration=4.043387747 podStartE2EDuration="14.869685679s" podCreationTimestamp="2026-03-19 12:16:03 +0000 UTC" firstStartedPulling="2026-03-19 12:16:05.688595531 +0000 UTC m=+999.458553566" lastFinishedPulling="2026-03-19 12:16:16.514893463 +0000 UTC m=+1010.284851498" observedRunningTime="2026-03-19 12:16:17.869549826 +0000 UTC m=+1011.639507881" watchObservedRunningTime="2026-03-19 12:16:17.869685679 +0000 UTC m=+1011.639643724" Mar 19 12:16:17.877862 master-0 kubenswrapper[17644]: I0319 12:16:17.877787 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-rr2cw" podStartSLOduration=2.441612692 podStartE2EDuration="12.877765435s" podCreationTimestamp="2026-03-19 12:16:05 +0000 UTC" firstStartedPulling="2026-03-19 12:16:06.149088317 +0000 UTC m=+999.919046352" lastFinishedPulling="2026-03-19 12:16:16.58524106 +0000 UTC m=+1010.355199095" observedRunningTime="2026-03-19 12:16:17.837755674 +0000 UTC m=+1011.607713719" watchObservedRunningTime="2026-03-19 12:16:17.877765435 +0000 UTC m=+1011.647723480" Mar 19 12:16:17.917709 master-0 kubenswrapper[17644]: I0319 12:16:17.911230 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-84qmh" podStartSLOduration=3.911210427 podStartE2EDuration="3.911210427s" podCreationTimestamp="2026-03-19 12:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:16:17.894967303 +0000 UTC m=+1011.664925348" watchObservedRunningTime="2026-03-19 12:16:17.911210427 +0000 UTC m=+1011.681168462" Mar 19 12:16:23.775445 master-0 kubenswrapper[17644]: I0319 12:16:23.775335 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-qgj2h" Mar 19 12:16:26.862067 master-0 kubenswrapper[17644]: I0319 12:16:26.861795 17644 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d" event={"ID":"6229bbeb-842d-4465-962e-f8148d05cf6f","Type":"ContainerStarted","Data":"60e25b209e71b8540a4442349c3f96e893f9f352e1529e27af598820f79a4226"} Mar 19 12:16:26.864504 master-0 kubenswrapper[17644]: I0319 12:16:26.863972 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-msc5g" event={"ID":"0edff6cf-09c0-4eba-81fa-a4e78b150269","Type":"ContainerStarted","Data":"c152056cb9247cae482db4bbdbc00443a44999af6541fcd239eb822df8b78a31"} Mar 19 12:16:26.866432 master-0 kubenswrapper[17644]: I0319 12:16:26.866371 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl" event={"ID":"7906718f-0803-4c47-a275-e2e02feb34c3","Type":"ContainerStarted","Data":"fbbe8cf254398054c88707d0fec9c2d5fc55c02699190329cc45c31a25d27bb1"} Mar 19 12:16:26.866848 master-0 kubenswrapper[17644]: I0319 12:16:26.866823 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl" Mar 19 12:16:26.867839 master-0 kubenswrapper[17644]: I0319 12:16:26.867803 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf" event={"ID":"56241f4e-fa18-46a7-9be2-5b6d54cd4e26","Type":"ContainerStarted","Data":"341eead8acc98857bd6384e92e53dc6104bc1458e71efacc18677ec2add7eece"} Mar 19 12:16:26.869559 master-0 kubenswrapper[17644]: I0319 12:16:26.869503 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6" event={"ID":"55660b4d-8a68-4062-bc7a-1216d9be2aa3","Type":"ContainerStarted","Data":"cb58415b1df9a33e3b555f7b8b9f2df3e75efcb949fb048f07c6b1d1416fc4c8"} Mar 19 12:16:26.869805 master-0 kubenswrapper[17644]: I0319 12:16:26.869751 17644 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6" Mar 19 12:16:26.875441 master-0 kubenswrapper[17644]: I0319 12:16:26.875398 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl" Mar 19 12:16:26.890248 master-0 kubenswrapper[17644]: I0319 12:16:26.890130 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d" podStartSLOduration=3.752948127 podStartE2EDuration="12.890102046s" podCreationTimestamp="2026-03-19 12:16:14 +0000 UTC" firstStartedPulling="2026-03-19 12:16:17.203370929 +0000 UTC m=+1010.973328954" lastFinishedPulling="2026-03-19 12:16:26.340524848 +0000 UTC m=+1020.110482873" observedRunningTime="2026-03-19 12:16:26.885419521 +0000 UTC m=+1020.655377586" watchObservedRunningTime="2026-03-19 12:16:26.890102046 +0000 UTC m=+1020.660060091" Mar 19 12:16:26.930226 master-0 kubenswrapper[17644]: I0319 12:16:26.930117 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-msc5g" podStartSLOduration=3.923808331 podStartE2EDuration="12.930095155s" podCreationTimestamp="2026-03-19 12:16:14 +0000 UTC" firstStartedPulling="2026-03-19 12:16:17.292789409 +0000 UTC m=+1011.062747444" lastFinishedPulling="2026-03-19 12:16:26.299076223 +0000 UTC m=+1020.069034268" observedRunningTime="2026-03-19 12:16:26.926230603 +0000 UTC m=+1020.696188658" watchObservedRunningTime="2026-03-19 12:16:26.930095155 +0000 UTC m=+1020.700053190" Mar 19 12:16:26.994761 master-0 kubenswrapper[17644]: I0319 12:16:26.991868 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-dwzhl" podStartSLOduration=2.944130906 podStartE2EDuration="11.991837534s" podCreationTimestamp="2026-03-19 12:16:15 +0000 
UTC" firstStartedPulling="2026-03-19 12:16:17.292786089 +0000 UTC m=+1011.062744124" lastFinishedPulling="2026-03-19 12:16:26.340492717 +0000 UTC m=+1020.110450752" observedRunningTime="2026-03-19 12:16:26.975219381 +0000 UTC m=+1020.745177426" watchObservedRunningTime="2026-03-19 12:16:26.991837534 +0000 UTC m=+1020.761795569" Mar 19 12:16:27.033940 master-0 kubenswrapper[17644]: I0319 12:16:27.031517 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf" podStartSLOduration=4.202193439 podStartE2EDuration="13.031493367s" podCreationTimestamp="2026-03-19 12:16:14 +0000 UTC" firstStartedPulling="2026-03-19 12:16:17.47166346 +0000 UTC m=+1011.241621495" lastFinishedPulling="2026-03-19 12:16:26.300963378 +0000 UTC m=+1020.070921423" observedRunningTime="2026-03-19 12:16:27.024122747 +0000 UTC m=+1020.794080782" watchObservedRunningTime="2026-03-19 12:16:27.031493367 +0000 UTC m=+1020.801451392" Mar 19 12:16:27.073754 master-0 kubenswrapper[17644]: I0319 12:16:27.072360 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6" podStartSLOduration=3.220550585 podStartE2EDuration="12.072337208s" podCreationTimestamp="2026-03-19 12:16:15 +0000 UTC" firstStartedPulling="2026-03-19 12:16:17.488757695 +0000 UTC m=+1011.258715730" lastFinishedPulling="2026-03-19 12:16:26.340544318 +0000 UTC m=+1020.110502353" observedRunningTime="2026-03-19 12:16:27.069832147 +0000 UTC m=+1020.839790192" watchObservedRunningTime="2026-03-19 12:16:27.072337208 +0000 UTC m=+1020.842295253" Mar 19 12:16:32.657236 master-0 kubenswrapper[17644]: I0319 12:16:32.657187 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b47fbc9b4-zf5jl" Mar 19 12:16:36.094143 master-0 kubenswrapper[17644]: I0319 12:16:36.094032 17644 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5554d8fd8f-jkhd6" Mar 19 12:16:51.910399 master-0 kubenswrapper[17644]: I0319 12:16:51.910277 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7c8d7d7bcf-bpkwf" Mar 19 12:17:00.162758 master-0 kubenswrapper[17644]: I0319 12:17:00.162263 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt"] Mar 19 12:17:00.167749 master-0 kubenswrapper[17644]: I0319 12:17:00.163405 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" Mar 19 12:17:00.167749 master-0 kubenswrapper[17644]: I0319 12:17:00.166646 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 12:17:00.204762 master-0 kubenswrapper[17644]: I0319 12:17:00.195300 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-tcwpc"] Mar 19 12:17:00.205435 master-0 kubenswrapper[17644]: I0319 12:17:00.205011 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.214755 master-0 kubenswrapper[17644]: I0319 12:17:00.207585 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 12:17:00.214755 master-0 kubenswrapper[17644]: I0319 12:17:00.207827 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 12:17:00.214755 master-0 kubenswrapper[17644]: I0319 12:17:00.213935 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt"] Mar 19 12:17:00.288753 master-0 kubenswrapper[17644]: I0319 12:17:00.288465 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-q82hx"] Mar 19 12:17:00.294744 master-0 kubenswrapper[17644]: I0319 12:17:00.291078 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-q82hx" Mar 19 12:17:00.297492 master-0 kubenswrapper[17644]: I0319 12:17:00.295609 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 12:17:00.297492 master-0 kubenswrapper[17644]: I0319 12:17:00.295892 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 12:17:00.297492 master-0 kubenswrapper[17644]: I0319 12:17:00.296001 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 12:17:00.323060 master-0 kubenswrapper[17644]: I0319 12:17:00.322195 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-46b4x"] Mar 19 12:17:00.323588 master-0 kubenswrapper[17644]: I0319 12:17:00.323550 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-46b4x" Mar 19 12:17:00.327241 master-0 kubenswrapper[17644]: I0319 12:17:00.327127 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 12:17:00.330129 master-0 kubenswrapper[17644]: I0319 12:17:00.329848 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-46b4x"] Mar 19 12:17:00.347165 master-0 kubenswrapper[17644]: I0319 12:17:00.346895 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-conf\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.347165 master-0 kubenswrapper[17644]: I0319 12:17:00.346955 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-sockets\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.347165 master-0 kubenswrapper[17644]: I0319 12:17:00.346989 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2lhl\" (UniqueName: \"kubernetes.io/projected/75793445-a8c5-4cf7-8d0b-561fae8411fe-kube-api-access-l2lhl\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.347165 master-0 kubenswrapper[17644]: I0319 12:17:00.347061 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75793445-a8c5-4cf7-8d0b-561fae8411fe-metrics-certs\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " 
pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.347165 master-0 kubenswrapper[17644]: I0319 12:17:00.347089 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-metrics\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.347165 master-0 kubenswrapper[17644]: I0319 12:17:00.347128 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-reloader\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.347662 master-0 kubenswrapper[17644]: I0319 12:17:00.347211 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5e6757-8626-4cd1-8736-b41978d173f1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rj2bt\" (UID: \"8c5e6757-8626-4cd1-8736-b41978d173f1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" Mar 19 12:17:00.347662 master-0 kubenswrapper[17644]: I0319 12:17:00.347244 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-startup\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.348582 master-0 kubenswrapper[17644]: I0319 12:17:00.348548 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcjjx\" (UniqueName: \"kubernetes.io/projected/8c5e6757-8626-4cd1-8736-b41978d173f1-kube-api-access-zcjjx\") pod \"frr-k8s-webhook-server-bcc4b6f68-rj2bt\" (UID: 
\"8c5e6757-8626-4cd1-8736-b41978d173f1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" Mar 19 12:17:00.450485 master-0 kubenswrapper[17644]: I0319 12:17:00.450421 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj6l9\" (UniqueName: \"kubernetes.io/projected/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-kube-api-access-pj6l9\") pod \"controller-7bb4cc7c98-46b4x\" (UID: \"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x" Mar 19 12:17:00.450485 master-0 kubenswrapper[17644]: I0319 12:17:00.450486 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fxct\" (UniqueName: \"kubernetes.io/projected/63931868-d8f1-4227-b911-81d786835fbb-kube-api-access-9fxct\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450578 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcjjx\" (UniqueName: \"kubernetes.io/projected/8c5e6757-8626-4cd1-8736-b41978d173f1-kube-api-access-zcjjx\") pod \"frr-k8s-webhook-server-bcc4b6f68-rj2bt\" (UID: \"8c5e6757-8626-4cd1-8736-b41978d173f1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450612 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-conf\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450635 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-sockets\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450663 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2lhl\" (UniqueName: \"kubernetes.io/projected/75793445-a8c5-4cf7-8d0b-561fae8411fe-kube-api-access-l2lhl\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450706 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75793445-a8c5-4cf7-8d0b-561fae8411fe-metrics-certs\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450752 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-metrics\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450795 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-reloader\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450821 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-cert\") pod \"controller-7bb4cc7c98-46b4x\" (UID: 
\"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450863 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-metrics-certs\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx" Mar 19 12:17:00.450897 master-0 kubenswrapper[17644]: I0319 12:17:00.450892 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/63931868-d8f1-4227-b911-81d786835fbb-metallb-excludel2\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx" Mar 19 12:17:00.452050 master-0 kubenswrapper[17644]: I0319 12:17:00.450914 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5e6757-8626-4cd1-8736-b41978d173f1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rj2bt\" (UID: \"8c5e6757-8626-4cd1-8736-b41978d173f1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" Mar 19 12:17:00.452050 master-0 kubenswrapper[17644]: I0319 12:17:00.450939 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-startup\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:00.452050 master-0 kubenswrapper[17644]: I0319 12:17:00.450955 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-metrics-certs\") pod \"controller-7bb4cc7c98-46b4x\" (UID: 
\"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:00.452050 master-0 kubenswrapper[17644]: I0319 12:17:00.450978 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:00.452050 master-0 kubenswrapper[17644]: I0319 12:17:00.451370 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-reloader\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:00.452050 master-0 kubenswrapper[17644]: I0319 12:17:00.451573 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-conf\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:00.452050 master-0 kubenswrapper[17644]: I0319 12:17:00.451844 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-sockets\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:00.453261 master-0 kubenswrapper[17644]: I0319 12:17:00.452518 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75793445-a8c5-4cf7-8d0b-561fae8411fe-metrics\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:00.453261 master-0 kubenswrapper[17644]: I0319 12:17:00.453043 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75793445-a8c5-4cf7-8d0b-561fae8411fe-frr-startup\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:00.454428 master-0 kubenswrapper[17644]: I0319 12:17:00.454377 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5e6757-8626-4cd1-8736-b41978d173f1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rj2bt\" (UID: \"8c5e6757-8626-4cd1-8736-b41978d173f1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt"
Mar 19 12:17:00.454825 master-0 kubenswrapper[17644]: I0319 12:17:00.454788 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75793445-a8c5-4cf7-8d0b-561fae8411fe-metrics-certs\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:00.466807 master-0 kubenswrapper[17644]: I0319 12:17:00.466770 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjjx\" (UniqueName: \"kubernetes.io/projected/8c5e6757-8626-4cd1-8736-b41978d173f1-kube-api-access-zcjjx\") pod \"frr-k8s-webhook-server-bcc4b6f68-rj2bt\" (UID: \"8c5e6757-8626-4cd1-8736-b41978d173f1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt"
Mar 19 12:17:00.467905 master-0 kubenswrapper[17644]: I0319 12:17:00.467849 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2lhl\" (UniqueName: \"kubernetes.io/projected/75793445-a8c5-4cf7-8d0b-561fae8411fe-kube-api-access-l2lhl\") pod \"frr-k8s-tcwpc\" (UID: \"75793445-a8c5-4cf7-8d0b-561fae8411fe\") " pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:00.504552 master-0 kubenswrapper[17644]: I0319 12:17:00.504498 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt"
Mar 19 12:17:00.524397 master-0 kubenswrapper[17644]: I0319 12:17:00.523116 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:00.553035 master-0 kubenswrapper[17644]: I0319 12:17:00.552949 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-cert\") pod \"controller-7bb4cc7c98-46b4x\" (UID: \"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:00.553240 master-0 kubenswrapper[17644]: I0319 12:17:00.553056 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-metrics-certs\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:00.553240 master-0 kubenswrapper[17644]: I0319 12:17:00.553103 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/63931868-d8f1-4227-b911-81d786835fbb-metallb-excludel2\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:00.553469 master-0 kubenswrapper[17644]: I0319 12:17:00.553440 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-metrics-certs\") pod \"controller-7bb4cc7c98-46b4x\" (UID: \"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:00.553540 master-0 kubenswrapper[17644]: I0319 12:17:00.553517 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:00.553625 master-0 kubenswrapper[17644]: I0319 12:17:00.553605 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj6l9\" (UniqueName: \"kubernetes.io/projected/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-kube-api-access-pj6l9\") pod \"controller-7bb4cc7c98-46b4x\" (UID: \"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:00.553675 master-0 kubenswrapper[17644]: I0319 12:17:00.553663 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fxct\" (UniqueName: \"kubernetes.io/projected/63931868-d8f1-4227-b911-81d786835fbb-kube-api-access-9fxct\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:00.554352 master-0 kubenswrapper[17644]: E0319 12:17:00.554310 17644 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 19 12:17:00.554406 master-0 kubenswrapper[17644]: E0319 12:17:00.554390 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist podName:63931868-d8f1-4227-b911-81d786835fbb nodeName:}" failed. No retries permitted until 2026-03-19 12:17:01.054366382 +0000 UTC m=+1054.824324417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist") pod "speaker-q82hx" (UID: "63931868-d8f1-4227-b911-81d786835fbb") : secret "metallb-memberlist" not found
Mar 19 12:17:00.554778 master-0 kubenswrapper[17644]: I0319 12:17:00.554719 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/63931868-d8f1-4227-b911-81d786835fbb-metallb-excludel2\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:00.556604 master-0 kubenswrapper[17644]: I0319 12:17:00.556576 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-metrics-certs\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:00.556889 master-0 kubenswrapper[17644]: I0319 12:17:00.556863 17644 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 19 12:17:00.558601 master-0 kubenswrapper[17644]: I0319 12:17:00.558568 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-metrics-certs\") pod \"controller-7bb4cc7c98-46b4x\" (UID: \"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:00.570839 master-0 kubenswrapper[17644]: I0319 12:17:00.567684 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-cert\") pod \"controller-7bb4cc7c98-46b4x\" (UID: \"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:00.574299 master-0 kubenswrapper[17644]: I0319 12:17:00.574264 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj6l9\" (UniqueName: \"kubernetes.io/projected/304c0dcc-9c40-4bf4-9c05-9d1a4601b15c-kube-api-access-pj6l9\") pod \"controller-7bb4cc7c98-46b4x\" (UID: \"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c\") " pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:00.575600 master-0 kubenswrapper[17644]: I0319 12:17:00.575570 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fxct\" (UniqueName: \"kubernetes.io/projected/63931868-d8f1-4227-b911-81d786835fbb-kube-api-access-9fxct\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:00.642604 master-0 kubenswrapper[17644]: I0319 12:17:00.642007 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:00.933850 master-0 kubenswrapper[17644]: I0319 12:17:00.933786 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt"]
Mar 19 12:17:01.054016 master-0 kubenswrapper[17644]: I0319 12:17:01.053957 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-46b4x"]
Mar 19 12:17:01.057999 master-0 kubenswrapper[17644]: W0319 12:17:01.057910 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod304c0dcc_9c40_4bf4_9c05_9d1a4601b15c.slice/crio-36416ff081ef01daf86f5f2795d0afdbd319bb987d37892268d22bc6b90e7185 WatchSource:0}: Error finding container 36416ff081ef01daf86f5f2795d0afdbd319bb987d37892268d22bc6b90e7185: Status 404 returned error can't find the container with id 36416ff081ef01daf86f5f2795d0afdbd319bb987d37892268d22bc6b90e7185
Mar 19 12:17:01.064368 master-0 kubenswrapper[17644]: I0319 12:17:01.064335 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:01.064599 master-0 kubenswrapper[17644]: E0319 12:17:01.064570 17644 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 19 12:17:01.064650 master-0 kubenswrapper[17644]: E0319 12:17:01.064622 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist podName:63931868-d8f1-4227-b911-81d786835fbb nodeName:}" failed. No retries permitted until 2026-03-19 12:17:02.064604815 +0000 UTC m=+1055.834562850 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist") pod "speaker-q82hx" (UID: "63931868-d8f1-4227-b911-81d786835fbb") : secret "metallb-memberlist" not found
Mar 19 12:17:01.175347 master-0 kubenswrapper[17644]: I0319 12:17:01.175287 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerStarted","Data":"ebe8f74bf194e47c424a34c3efc96ad78e525473e6d4620a68292240418745e3"}
Mar 19 12:17:01.176892 master-0 kubenswrapper[17644]: I0319 12:17:01.176835 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-46b4x" event={"ID":"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c","Type":"ContainerStarted","Data":"786e5bc7d405d5bde400790d7e94b0ef5dcecd90bb54c061ae6553b54cb46127"}
Mar 19 12:17:01.176961 master-0 kubenswrapper[17644]: I0319 12:17:01.176895 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-46b4x" event={"ID":"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c","Type":"ContainerStarted","Data":"36416ff081ef01daf86f5f2795d0afdbd319bb987d37892268d22bc6b90e7185"}
Mar 19 12:17:01.177830 master-0 kubenswrapper[17644]: I0319 12:17:01.177789 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" event={"ID":"8c5e6757-8626-4cd1-8736-b41978d173f1","Type":"ContainerStarted","Data":"14dd125ffa93073fc2939f99631629875d1d90cbf8212f2cbc408f4700212b86"}
Mar 19 12:17:02.084490 master-0 kubenswrapper[17644]: I0319 12:17:02.084431 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:02.087704 master-0 kubenswrapper[17644]: I0319 12:17:02.087665 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63931868-d8f1-4227-b911-81d786835fbb-memberlist\") pod \"speaker-q82hx\" (UID: \"63931868-d8f1-4227-b911-81d786835fbb\") " pod="metallb-system/speaker-q82hx"
Mar 19 12:17:02.112027 master-0 kubenswrapper[17644]: I0319 12:17:02.111971 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-q82hx"
Mar 19 12:17:02.188774 master-0 kubenswrapper[17644]: I0319 12:17:02.188703 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q82hx" event={"ID":"63931868-d8f1-4227-b911-81d786835fbb","Type":"ContainerStarted","Data":"292a4543494df094ccddef01cee7bf4f45782e802b6540ac3786813f13d73947"}
Mar 19 12:17:02.417412 master-0 kubenswrapper[17644]: I0319 12:17:02.416556 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v"]
Mar 19 12:17:02.417972 master-0 kubenswrapper[17644]: I0319 12:17:02.417917 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v"
Mar 19 12:17:02.432629 master-0 kubenswrapper[17644]: I0319 12:17:02.431406 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"]
Mar 19 12:17:02.436556 master-0 kubenswrapper[17644]: I0319 12:17:02.434120 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"
Mar 19 12:17:02.440122 master-0 kubenswrapper[17644]: I0319 12:17:02.440080 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 19 12:17:02.448202 master-0 kubenswrapper[17644]: I0319 12:17:02.448144 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v"]
Mar 19 12:17:02.467154 master-0 kubenswrapper[17644]: I0319 12:17:02.465789 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"]
Mar 19 12:17:02.469937 master-0 kubenswrapper[17644]: I0319 12:17:02.469777 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-pbgxp"]
Mar 19 12:17:02.471166 master-0 kubenswrapper[17644]: I0319 12:17:02.471141 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.596271 master-0 kubenswrapper[17644]: I0319 12:17:02.596207 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv9jw\" (UniqueName: \"kubernetes.io/projected/8d8c7433-5218-4713-9e76-1c94175acd1c-kube-api-access-sv9jw\") pod \"nmstate-webhook-5f558f5558-94bfp\" (UID: \"8d8c7433-5218-4713-9e76-1c94175acd1c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"
Mar 19 12:17:02.596496 master-0 kubenswrapper[17644]: I0319 12:17:02.596355 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hxvh\" (UniqueName: \"kubernetes.io/projected/defe43e5-1621-42e7-9e79-bc48c2bbfb5c-kube-api-access-8hxvh\") pod \"nmstate-metrics-9b8c8685d-7hz8v\" (UID: \"defe43e5-1621-42e7-9e79-bc48c2bbfb5c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v"
Mar 19 12:17:02.596496 master-0 kubenswrapper[17644]: I0319 12:17:02.596482 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8d8c7433-5218-4713-9e76-1c94175acd1c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-94bfp\" (UID: \"8d8c7433-5218-4713-9e76-1c94175acd1c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"
Mar 19 12:17:02.600359 master-0 kubenswrapper[17644]: I0319 12:17:02.599586 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xk2k\" (UniqueName: \"kubernetes.io/projected/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-kube-api-access-4xk2k\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.600359 master-0 kubenswrapper[17644]: I0319 12:17:02.599667 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-nmstate-lock\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.600359 master-0 kubenswrapper[17644]: I0319 12:17:02.599715 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-dbus-socket\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.600359 master-0 kubenswrapper[17644]: I0319 12:17:02.599851 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-ovs-socket\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.662894 master-0 kubenswrapper[17644]: I0319 12:17:02.661169 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"]
Mar 19 12:17:02.662894 master-0 kubenswrapper[17644]: I0319 12:17:02.662340 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:02.671776 master-0 kubenswrapper[17644]: I0319 12:17:02.671712 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"]
Mar 19 12:17:02.676777 master-0 kubenswrapper[17644]: I0319 12:17:02.676627 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 19 12:17:02.676902 master-0 kubenswrapper[17644]: I0319 12:17:02.676838 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 19 12:17:02.700896 master-0 kubenswrapper[17644]: I0319 12:17:02.700831 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-ovs-socket\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.700896 master-0 kubenswrapper[17644]: I0319 12:17:02.700892 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv9jw\" (UniqueName: \"kubernetes.io/projected/8d8c7433-5218-4713-9e76-1c94175acd1c-kube-api-access-sv9jw\") pod \"nmstate-webhook-5f558f5558-94bfp\" (UID: \"8d8c7433-5218-4713-9e76-1c94175acd1c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"
Mar 19 12:17:02.701159 master-0 kubenswrapper[17644]: I0319 12:17:02.700944 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gns2q\" (UniqueName: \"kubernetes.io/projected/bda90bb9-a85d-4dba-b00b-7721557694bc-kube-api-access-gns2q\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:02.701159 master-0 kubenswrapper[17644]: I0319 12:17:02.700983 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hxvh\" (UniqueName: \"kubernetes.io/projected/defe43e5-1621-42e7-9e79-bc48c2bbfb5c-kube-api-access-8hxvh\") pod \"nmstate-metrics-9b8c8685d-7hz8v\" (UID: \"defe43e5-1621-42e7-9e79-bc48c2bbfb5c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v"
Mar 19 12:17:02.701242 master-0 kubenswrapper[17644]: I0319 12:17:02.701168 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8d8c7433-5218-4713-9e76-1c94175acd1c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-94bfp\" (UID: \"8d8c7433-5218-4713-9e76-1c94175acd1c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"
Mar 19 12:17:02.701242 master-0 kubenswrapper[17644]: I0319 12:17:02.701225 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xk2k\" (UniqueName: \"kubernetes.io/projected/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-kube-api-access-4xk2k\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.701311 master-0 kubenswrapper[17644]: I0319 12:17:02.701246 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-nmstate-lock\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.701311 master-0 kubenswrapper[17644]: I0319 12:17:02.701264 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bda90bb9-a85d-4dba-b00b-7721557694bc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:02.701422 master-0 kubenswrapper[17644]: I0319 12:17:02.701372 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-nmstate-lock\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.701609 master-0 kubenswrapper[17644]: E0319 12:17:02.701581 17644 secret.go:189] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Mar 19 12:17:02.701754 master-0 kubenswrapper[17644]: E0319 12:17:02.701741 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d8c7433-5218-4713-9e76-1c94175acd1c-tls-key-pair podName:8d8c7433-5218-4713-9e76-1c94175acd1c nodeName:}" failed. No retries permitted until 2026-03-19 12:17:03.201705113 +0000 UTC m=+1056.971663148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8d8c7433-5218-4713-9e76-1c94175acd1c-tls-key-pair") pod "nmstate-webhook-5f558f5558-94bfp" (UID: "8d8c7433-5218-4713-9e76-1c94175acd1c") : secret "openshift-nmstate-webhook" not found
Mar 19 12:17:02.701878 master-0 kubenswrapper[17644]: I0319 12:17:02.701857 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-ovs-socket\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.704425 master-0 kubenswrapper[17644]: I0319 12:17:02.701943 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-dbus-socket\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.704425 master-0 kubenswrapper[17644]: I0319 12:17:02.701990 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda90bb9-a85d-4dba-b00b-7721557694bc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:02.704425 master-0 kubenswrapper[17644]: I0319 12:17:02.702177 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-dbus-socket\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.718317 master-0 kubenswrapper[17644]: I0319 12:17:02.718273 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv9jw\" (UniqueName: \"kubernetes.io/projected/8d8c7433-5218-4713-9e76-1c94175acd1c-kube-api-access-sv9jw\") pod \"nmstate-webhook-5f558f5558-94bfp\" (UID: \"8d8c7433-5218-4713-9e76-1c94175acd1c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"
Mar 19 12:17:02.719140 master-0 kubenswrapper[17644]: I0319 12:17:02.719114 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hxvh\" (UniqueName: \"kubernetes.io/projected/defe43e5-1621-42e7-9e79-bc48c2bbfb5c-kube-api-access-8hxvh\") pod \"nmstate-metrics-9b8c8685d-7hz8v\" (UID: \"defe43e5-1621-42e7-9e79-bc48c2bbfb5c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v"
Mar 19 12:17:02.719508 master-0 kubenswrapper[17644]: I0319 12:17:02.719489 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xk2k\" (UniqueName: \"kubernetes.io/projected/e6f7e8f5-8cca-400e-9eea-2961d3f9920f-kube-api-access-4xk2k\") pod \"nmstate-handler-pbgxp\" (UID: \"e6f7e8f5-8cca-400e-9eea-2961d3f9920f\") " pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.803640 master-0 kubenswrapper[17644]: I0319 12:17:02.803578 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bda90bb9-a85d-4dba-b00b-7721557694bc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:02.803640 master-0 kubenswrapper[17644]: I0319 12:17:02.803635 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda90bb9-a85d-4dba-b00b-7721557694bc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:02.803911 master-0 kubenswrapper[17644]: I0319 12:17:02.803695 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gns2q\" (UniqueName: \"kubernetes.io/projected/bda90bb9-a85d-4dba-b00b-7721557694bc-kube-api-access-gns2q\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:02.805055 master-0 kubenswrapper[17644]: E0319 12:17:02.805001 17644 secret.go:189] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Mar 19 12:17:02.805221 master-0 kubenswrapper[17644]: E0319 12:17:02.805207 17644 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bda90bb9-a85d-4dba-b00b-7721557694bc-plugin-serving-cert podName:bda90bb9-a85d-4dba-b00b-7721557694bc nodeName:}" failed. No retries permitted until 2026-03-19 12:17:03.305179914 +0000 UTC m=+1057.075138009 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/bda90bb9-a85d-4dba-b00b-7721557694bc-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-5v79v" (UID: "bda90bb9-a85d-4dba-b00b-7721557694bc") : secret "plugin-serving-cert" not found
Mar 19 12:17:02.811613 master-0 kubenswrapper[17644]: I0319 12:17:02.811564 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bda90bb9-a85d-4dba-b00b-7721557694bc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:02.849631 master-0 kubenswrapper[17644]: I0319 12:17:02.849534 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v"
Mar 19 12:17:02.879601 master-0 kubenswrapper[17644]: I0319 12:17:02.879530 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pbgxp"
Mar 19 12:17:02.959706 master-0 kubenswrapper[17644]: I0319 12:17:02.959484 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gns2q\" (UniqueName: \"kubernetes.io/projected/bda90bb9-a85d-4dba-b00b-7721557694bc-kube-api-access-gns2q\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"
Mar 19 12:17:03.025309 master-0 kubenswrapper[17644]: I0319 12:17:03.023019 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-657ddb758d-h2zns"]
Mar 19 12:17:03.025309 master-0 kubenswrapper[17644]: I0319 12:17:03.024540 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-657ddb758d-h2zns"
Mar 19 12:17:03.045871 master-0 kubenswrapper[17644]: I0319 12:17:03.044236 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-657ddb758d-h2zns"]
Mar 19 12:17:03.202649 master-0 kubenswrapper[17644]: I0319 12:17:03.202269 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q82hx" event={"ID":"63931868-d8f1-4227-b911-81d786835fbb","Type":"ContainerStarted","Data":"483a9ddcd951aee48fd90a8b25acd74985e40da3fcca63a87acca0b33133c188"}
Mar 19 12:17:03.205659 master-0 kubenswrapper[17644]: I0319 12:17:03.205593 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-46b4x" event={"ID":"304c0dcc-9c40-4bf4-9c05-9d1a4601b15c","Type":"ContainerStarted","Data":"60fc67654cc09c1575cd618f292545a372072e6b4a8d0628ff4fe308f813e449"}
Mar 19 12:17:03.205960 master-0 kubenswrapper[17644]: I0319 12:17:03.205909 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-46b4x"
Mar 19 12:17:03.209794 master-0 kubenswrapper[17644]: I0319 12:17:03.209752 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pbgxp" event={"ID":"e6f7e8f5-8cca-400e-9eea-2961d3f9920f","Type":"ContainerStarted","Data":"de8bec3ec12373a7de1960f87e8f873442834899e2654d67054d5cad412173e8"}
Mar 19 12:17:03.214581 master-0 kubenswrapper[17644]: I0319 12:17:03.214511 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-trusted-ca-bundle\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns"
Mar 19 12:17:03.214581 master-0 kubenswrapper[17644]: I0319 12:17:03.214579 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-config\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns"
Mar 19 12:17:03.214896 master-0 kubenswrapper[17644]: I0319 12:17:03.214603 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-oauth-config\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns"
Mar 19 12:17:03.214896 master-0 kubenswrapper[17644]: I0319 12:17:03.214627 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-service-ca\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns"
Mar 19 12:17:03.214896 master-0 kubenswrapper[17644]: I0319 12:17:03.214664 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fj95\" (UniqueName: \"kubernetes.io/projected/db5ffad0-a78e-4f97-a915-39bf347b53ca-kube-api-access-9fj95\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns"
Mar 19 12:17:03.214896 master-0 kubenswrapper[17644]: I0319 12:17:03.214689 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-serving-cert\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns"
Mar 19 12:17:03.214896 master-0 kubenswrapper[17644]: I0319 12:17:03.214740 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8d8c7433-5218-4713-9e76-1c94175acd1c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-94bfp\" (UID: \"8d8c7433-5218-4713-9e76-1c94175acd1c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"
Mar 19 12:17:03.214896 master-0 kubenswrapper[17644]: I0319 12:17:03.214763 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-oauth-serving-cert\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns"
Mar 19 12:17:03.251365 master-0 kubenswrapper[17644]: I0319 12:17:03.251231 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-46b4x" podStartSLOduration=1.401820228 podStartE2EDuration="3.251208118s" podCreationTimestamp="2026-03-19 12:17:00 +0000 UTC" firstStartedPulling="2026-03-19 12:17:01.163608037 +0000 UTC m=+1054.933566072" lastFinishedPulling="2026-03-19 12:17:03.012995927 +0000 UTC m=+1056.782953962" observedRunningTime="2026-03-19 12:17:03.241247186 +0000 UTC m=+1057.011205241" watchObservedRunningTime="2026-03-19 12:17:03.251208118 +0000 UTC m=+1057.021166143"
Mar 19 12:17:03.256022 master-0 kubenswrapper[17644]: I0319 12:17:03.255978 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8d8c7433-5218-4713-9e76-1c94175acd1c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-94bfp\" (UID: \"8d8c7433-5218-4713-9e76-1c94175acd1c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"
Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319
12:17:03.316014 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda90bb9-a85d-4dba-b00b-7721557694bc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v" Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319 12:17:03.316096 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-trusted-ca-bundle\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319 12:17:03.316141 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-config\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319 12:17:03.316162 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-oauth-config\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319 12:17:03.316309 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-service-ca\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" 
Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319 12:17:03.316381 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fj95\" (UniqueName: \"kubernetes.io/projected/db5ffad0-a78e-4f97-a915-39bf347b53ca-kube-api-access-9fj95\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319 12:17:03.316689 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-serving-cert\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319 12:17:03.316792 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-oauth-serving-cert\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.317760 master-0 kubenswrapper[17644]: I0319 12:17:03.317209 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-config\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.318313 master-0 kubenswrapper[17644]: I0319 12:17:03.317872 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-trusted-ca-bundle\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " 
pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.320399 master-0 kubenswrapper[17644]: I0319 12:17:03.320196 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-oauth-serving-cert\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.321095 master-0 kubenswrapper[17644]: I0319 12:17:03.321077 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db5ffad0-a78e-4f97-a915-39bf347b53ca-service-ca\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.327491 master-0 kubenswrapper[17644]: I0319 12:17:03.327442 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-oauth-config\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.328209 master-0 kubenswrapper[17644]: I0319 12:17:03.327507 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda90bb9-a85d-4dba-b00b-7721557694bc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5v79v\" (UID: \"bda90bb9-a85d-4dba-b00b-7721557694bc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v" Mar 19 12:17:03.328473 master-0 kubenswrapper[17644]: I0319 12:17:03.328451 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db5ffad0-a78e-4f97-a915-39bf347b53ca-console-serving-cert\") pod \"console-657ddb758d-h2zns\" (UID: 
\"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.338286 master-0 kubenswrapper[17644]: I0319 12:17:03.338264 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fj95\" (UniqueName: \"kubernetes.io/projected/db5ffad0-a78e-4f97-a915-39bf347b53ca-kube-api-access-9fj95\") pod \"console-657ddb758d-h2zns\" (UID: \"db5ffad0-a78e-4f97-a915-39bf347b53ca\") " pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.379963 master-0 kubenswrapper[17644]: I0319 12:17:03.379846 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:03.440564 master-0 kubenswrapper[17644]: W0319 12:17:03.440510 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefe43e5_1621_42e7_9e79_bc48c2bbfb5c.slice/crio-5105bc4396c92880b50e453c30dbd323293b0869ad5135cf1a9c0954cd009e20 WatchSource:0}: Error finding container 5105bc4396c92880b50e453c30dbd323293b0869ad5135cf1a9c0954cd009e20: Status 404 returned error can't find the container with id 5105bc4396c92880b50e453c30dbd323293b0869ad5135cf1a9c0954cd009e20 Mar 19 12:17:03.442305 master-0 kubenswrapper[17644]: I0319 12:17:03.442268 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v"] Mar 19 12:17:03.467346 master-0 kubenswrapper[17644]: I0319 12:17:03.467214 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp" Mar 19 12:17:03.597910 master-0 kubenswrapper[17644]: I0319 12:17:03.597422 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v" Mar 19 12:17:03.995067 master-0 kubenswrapper[17644]: I0319 12:17:03.994167 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-657ddb758d-h2zns"] Mar 19 12:17:03.996483 master-0 kubenswrapper[17644]: W0319 12:17:03.996391 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb5ffad0_a78e_4f97_a915_39bf347b53ca.slice/crio-490c5fedfe815e4c3eb5bc528a4b7b33f5cbfcb5b505fc7651be063b816bd4e4 WatchSource:0}: Error finding container 490c5fedfe815e4c3eb5bc528a4b7b33f5cbfcb5b505fc7651be063b816bd4e4: Status 404 returned error can't find the container with id 490c5fedfe815e4c3eb5bc528a4b7b33f5cbfcb5b505fc7651be063b816bd4e4 Mar 19 12:17:04.027089 master-0 kubenswrapper[17644]: I0319 12:17:04.027018 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-94bfp"] Mar 19 12:17:04.043901 master-0 kubenswrapper[17644]: W0319 12:17:04.043833 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d8c7433_5218_4713_9e76_1c94175acd1c.slice/crio-2454538ce9d37f7a2af95e990823cc9e92f1545dbe49e5652b1c545cfce3833a WatchSource:0}: Error finding container 2454538ce9d37f7a2af95e990823cc9e92f1545dbe49e5652b1c545cfce3833a: Status 404 returned error can't find the container with id 2454538ce9d37f7a2af95e990823cc9e92f1545dbe49e5652b1c545cfce3833a Mar 19 12:17:04.089712 master-0 kubenswrapper[17644]: W0319 12:17:04.085420 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda90bb9_a85d_4dba_b00b_7721557694bc.slice/crio-aa5d92b44dfced168769cce301e31ddc55675c8df921051bc20c57aeebe7d9fe WatchSource:0}: Error finding container aa5d92b44dfced168769cce301e31ddc55675c8df921051bc20c57aeebe7d9fe: 
Status 404 returned error can't find the container with id aa5d92b44dfced168769cce301e31ddc55675c8df921051bc20c57aeebe7d9fe Mar 19 12:17:04.089712 master-0 kubenswrapper[17644]: I0319 12:17:04.086684 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v"] Mar 19 12:17:04.223907 master-0 kubenswrapper[17644]: I0319 12:17:04.222997 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-657ddb758d-h2zns" event={"ID":"db5ffad0-a78e-4f97-a915-39bf347b53ca","Type":"ContainerStarted","Data":"5c110bcae074547160bfd31df357aff5eac3d8b68392c7e2c42fd664366c180d"} Mar 19 12:17:04.223907 master-0 kubenswrapper[17644]: I0319 12:17:04.223061 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-657ddb758d-h2zns" event={"ID":"db5ffad0-a78e-4f97-a915-39bf347b53ca","Type":"ContainerStarted","Data":"490c5fedfe815e4c3eb5bc528a4b7b33f5cbfcb5b505fc7651be063b816bd4e4"} Mar 19 12:17:04.235273 master-0 kubenswrapper[17644]: I0319 12:17:04.225011 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v" event={"ID":"bda90bb9-a85d-4dba-b00b-7721557694bc","Type":"ContainerStarted","Data":"aa5d92b44dfced168769cce301e31ddc55675c8df921051bc20c57aeebe7d9fe"} Mar 19 12:17:04.235273 master-0 kubenswrapper[17644]: I0319 12:17:04.226978 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v" event={"ID":"defe43e5-1621-42e7-9e79-bc48c2bbfb5c","Type":"ContainerStarted","Data":"5105bc4396c92880b50e453c30dbd323293b0869ad5135cf1a9c0954cd009e20"} Mar 19 12:17:04.235273 master-0 kubenswrapper[17644]: I0319 12:17:04.233824 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp" 
event={"ID":"8d8c7433-5218-4713-9e76-1c94175acd1c","Type":"ContainerStarted","Data":"2454538ce9d37f7a2af95e990823cc9e92f1545dbe49e5652b1c545cfce3833a"} Mar 19 12:17:04.260007 master-0 kubenswrapper[17644]: I0319 12:17:04.259883 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-657ddb758d-h2zns" podStartSLOduration=2.25878163 podStartE2EDuration="2.25878163s" podCreationTimestamp="2026-03-19 12:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:17:04.24599023 +0000 UTC m=+1058.015948285" watchObservedRunningTime="2026-03-19 12:17:04.25878163 +0000 UTC m=+1058.028739665" Mar 19 12:17:05.249058 master-0 kubenswrapper[17644]: I0319 12:17:05.248989 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q82hx" event={"ID":"63931868-d8f1-4227-b911-81d786835fbb","Type":"ContainerStarted","Data":"e50f264fb8eb906272c4e8624a658ad3aaedb0185c4c6e5838d3132102c39d1a"} Mar 19 12:17:05.249718 master-0 kubenswrapper[17644]: I0319 12:17:05.249682 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-q82hx" Mar 19 12:17:05.273409 master-0 kubenswrapper[17644]: I0319 12:17:05.273338 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-q82hx" podStartSLOduration=3.65879802 podStartE2EDuration="5.273316841s" podCreationTimestamp="2026-03-19 12:17:00 +0000 UTC" firstStartedPulling="2026-03-19 12:17:02.392160421 +0000 UTC m=+1056.162118456" lastFinishedPulling="2026-03-19 12:17:04.006679242 +0000 UTC m=+1057.776637277" observedRunningTime="2026-03-19 12:17:05.27327521 +0000 UTC m=+1059.043233255" watchObservedRunningTime="2026-03-19 12:17:05.273316841 +0000 UTC m=+1059.043274876" Mar 19 12:17:10.331454 master-0 kubenswrapper[17644]: I0319 12:17:10.331381 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v" event={"ID":"bda90bb9-a85d-4dba-b00b-7721557694bc","Type":"ContainerStarted","Data":"5624ddd8ada4f4b2c55c9e45e4010e19f01eff2a628940c0bb418b72e9467cce"} Mar 19 12:17:10.333987 master-0 kubenswrapper[17644]: I0319 12:17:10.333936 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pbgxp" event={"ID":"e6f7e8f5-8cca-400e-9eea-2961d3f9920f","Type":"ContainerStarted","Data":"cbebe2b49f8275379e60e09301c712cce1b8e376d25a4e938ada8177357e7a8d"} Mar 19 12:17:10.334560 master-0 kubenswrapper[17644]: I0319 12:17:10.334533 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-pbgxp" Mar 19 12:17:10.336156 master-0 kubenswrapper[17644]: I0319 12:17:10.336128 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v" event={"ID":"defe43e5-1621-42e7-9e79-bc48c2bbfb5c","Type":"ContainerStarted","Data":"eb7575f187609ace9c3c34c6c006e330d8224668da16d944559d5a66d57a81f6"} Mar 19 12:17:10.336156 master-0 kubenswrapper[17644]: I0319 12:17:10.336153 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v" event={"ID":"defe43e5-1621-42e7-9e79-bc48c2bbfb5c","Type":"ContainerStarted","Data":"06bd13166b4229484caef897883a8261a0e7945a7e905b0c156b0f739513038a"} Mar 19 12:17:10.337634 master-0 kubenswrapper[17644]: I0319 12:17:10.337592 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp" event={"ID":"8d8c7433-5218-4713-9e76-1c94175acd1c","Type":"ContainerStarted","Data":"dde62eff62b0513d19411c0b0f9faf241452333668cf62cfdb2cbcb484c9098c"} Mar 19 12:17:10.337998 master-0 kubenswrapper[17644]: I0319 12:17:10.337962 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp" Mar 19 12:17:10.340812 master-0 
kubenswrapper[17644]: I0319 12:17:10.339267 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" event={"ID":"8c5e6757-8626-4cd1-8736-b41978d173f1","Type":"ContainerStarted","Data":"f44c1331d4901a9bc8772342cd69c00584a0afefdf457d9755f2289be0d9ad62"} Mar 19 12:17:10.340812 master-0 kubenswrapper[17644]: I0319 12:17:10.339361 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" Mar 19 12:17:10.340919 master-0 kubenswrapper[17644]: I0319 12:17:10.340885 17644 generic.go:334] "Generic (PLEG): container finished" podID="75793445-a8c5-4cf7-8d0b-561fae8411fe" containerID="6c419719e90d0d27a1717c6c40b68177fcd0420c8ae8f5df3fe3a2fc1342f3b4" exitCode=0 Mar 19 12:17:10.340919 master-0 kubenswrapper[17644]: I0319 12:17:10.340910 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerDied","Data":"6c419719e90d0d27a1717c6c40b68177fcd0420c8ae8f5df3fe3a2fc1342f3b4"} Mar 19 12:17:10.369482 master-0 kubenswrapper[17644]: I0319 12:17:10.368965 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5v79v" podStartSLOduration=2.731066412 podStartE2EDuration="8.368948641s" podCreationTimestamp="2026-03-19 12:17:02 +0000 UTC" firstStartedPulling="2026-03-19 12:17:04.090588518 +0000 UTC m=+1057.860546553" lastFinishedPulling="2026-03-19 12:17:09.728470747 +0000 UTC m=+1063.498428782" observedRunningTime="2026-03-19 12:17:10.357834191 +0000 UTC m=+1064.127792246" watchObservedRunningTime="2026-03-19 12:17:10.368948641 +0000 UTC m=+1064.138906676" Mar 19 12:17:10.421247 master-0 kubenswrapper[17644]: I0319 12:17:10.419421 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" 
podStartSLOduration=1.701759537 podStartE2EDuration="10.419395395s" podCreationTimestamp="2026-03-19 12:17:00 +0000 UTC" firstStartedPulling="2026-03-19 12:17:00.935808329 +0000 UTC m=+1054.705766374" lastFinishedPulling="2026-03-19 12:17:09.653444197 +0000 UTC m=+1063.423402232" observedRunningTime="2026-03-19 12:17:10.382354445 +0000 UTC m=+1064.152312500" watchObservedRunningTime="2026-03-19 12:17:10.419395395 +0000 UTC m=+1064.189353430" Mar 19 12:17:10.424420 master-0 kubenswrapper[17644]: I0319 12:17:10.424332 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-pbgxp" podStartSLOduration=1.682244998 podStartE2EDuration="8.424307134s" podCreationTimestamp="2026-03-19 12:17:02 +0000 UTC" firstStartedPulling="2026-03-19 12:17:02.983072831 +0000 UTC m=+1056.753030866" lastFinishedPulling="2026-03-19 12:17:09.725134967 +0000 UTC m=+1063.495093002" observedRunningTime="2026-03-19 12:17:10.400587879 +0000 UTC m=+1064.170545914" watchObservedRunningTime="2026-03-19 12:17:10.424307134 +0000 UTC m=+1064.194265189" Mar 19 12:17:10.474542 master-0 kubenswrapper[17644]: I0319 12:17:10.474460 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp" podStartSLOduration=2.798858555 podStartE2EDuration="8.4744374s" podCreationTimestamp="2026-03-19 12:17:02 +0000 UTC" firstStartedPulling="2026-03-19 12:17:04.047615175 +0000 UTC m=+1057.817573210" lastFinishedPulling="2026-03-19 12:17:09.72319403 +0000 UTC m=+1063.493152055" observedRunningTime="2026-03-19 12:17:10.420574314 +0000 UTC m=+1064.190532369" watchObservedRunningTime="2026-03-19 12:17:10.4744374 +0000 UTC m=+1064.244395445" Mar 19 12:17:10.500963 master-0 kubenswrapper[17644]: I0319 12:17:10.500850 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7hz8v" podStartSLOduration=2.232044671 
podStartE2EDuration="8.500832311s" podCreationTimestamp="2026-03-19 12:17:02 +0000 UTC" firstStartedPulling="2026-03-19 12:17:03.444101269 +0000 UTC m=+1057.214059304" lastFinishedPulling="2026-03-19 12:17:09.712888909 +0000 UTC m=+1063.482846944" observedRunningTime="2026-03-19 12:17:10.498476834 +0000 UTC m=+1064.268434889" watchObservedRunningTime="2026-03-19 12:17:10.500832311 +0000 UTC m=+1064.270790336" Mar 19 12:17:11.352260 master-0 kubenswrapper[17644]: I0319 12:17:11.352195 17644 generic.go:334] "Generic (PLEG): container finished" podID="75793445-a8c5-4cf7-8d0b-561fae8411fe" containerID="0f0a974a5d17494d2dc9a796ea700cb58493edd3eb41aad0292be5ff8b6903d2" exitCode=0 Mar 19 12:17:11.353354 master-0 kubenswrapper[17644]: I0319 12:17:11.353309 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerDied","Data":"0f0a974a5d17494d2dc9a796ea700cb58493edd3eb41aad0292be5ff8b6903d2"} Mar 19 12:17:12.116325 master-0 kubenswrapper[17644]: I0319 12:17:12.116288 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-q82hx" Mar 19 12:17:12.363506 master-0 kubenswrapper[17644]: I0319 12:17:12.363451 17644 generic.go:334] "Generic (PLEG): container finished" podID="75793445-a8c5-4cf7-8d0b-561fae8411fe" containerID="3f909f486f6878d1b3b33f0ad7935b4d54379b04eea72b94bf607094db5b7ed4" exitCode=0 Mar 19 12:17:12.364278 master-0 kubenswrapper[17644]: I0319 12:17:12.363514 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerDied","Data":"3f909f486f6878d1b3b33f0ad7935b4d54379b04eea72b94bf607094db5b7ed4"} Mar 19 12:17:13.377874 master-0 kubenswrapper[17644]: I0319 12:17:13.376615 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" 
event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerStarted","Data":"952059f6fe946734606e76a705b7eebeda923de840d49c6f9121c06ed8a856ad"} Mar 19 12:17:13.377874 master-0 kubenswrapper[17644]: I0319 12:17:13.376666 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerStarted","Data":"f9d98d24e91772d01992fb19786ba5e55b6736cc9e66c3895ff13f6392e89bdb"} Mar 19 12:17:13.377874 master-0 kubenswrapper[17644]: I0319 12:17:13.376692 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerStarted","Data":"48b59ede442233b666073b83c5ea590e3afd7705ee872f2c4e1cb98520fcab12"} Mar 19 12:17:13.377874 master-0 kubenswrapper[17644]: I0319 12:17:13.376700 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerStarted","Data":"4bee2569f2d20e3e6a1ac96c905026c78684f33abb47abd484bdf6b538204758"} Mar 19 12:17:13.377874 master-0 kubenswrapper[17644]: I0319 12:17:13.376709 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerStarted","Data":"41bde6a4d0885c2a29df30033132467a84175dc31c833d94bb99ee01cca9fc7d"} Mar 19 12:17:13.380790 master-0 kubenswrapper[17644]: I0319 12:17:13.380758 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:13.381109 master-0 kubenswrapper[17644]: I0319 12:17:13.381090 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:13.388051 master-0 kubenswrapper[17644]: I0319 12:17:13.388010 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:14.389428 master-0 kubenswrapper[17644]: I0319 12:17:14.389345 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tcwpc" event={"ID":"75793445-a8c5-4cf7-8d0b-561fae8411fe","Type":"ContainerStarted","Data":"fdb7fcd833502419b4c7260c1e955bf4774f04c46e51311b99b214eb28e1c900"} Mar 19 12:17:14.392866 master-0 kubenswrapper[17644]: I0319 12:17:14.392820 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-657ddb758d-h2zns" Mar 19 12:17:14.417192 master-0 kubenswrapper[17644]: I0319 12:17:14.417069 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-tcwpc" podStartSLOduration=5.380491341 podStartE2EDuration="14.417050369s" podCreationTimestamp="2026-03-19 12:17:00 +0000 UTC" firstStartedPulling="2026-03-19 12:17:00.685264949 +0000 UTC m=+1054.455222984" lastFinishedPulling="2026-03-19 12:17:09.721823977 +0000 UTC m=+1063.491782012" observedRunningTime="2026-03-19 12:17:14.413951784 +0000 UTC m=+1068.183909869" watchObservedRunningTime="2026-03-19 12:17:14.417050369 +0000 UTC m=+1068.187008414" Mar 19 12:17:14.509941 master-0 kubenswrapper[17644]: I0319 12:17:14.508700 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69c7fd464c-4x4r7"] Mar 19 12:17:15.396512 master-0 kubenswrapper[17644]: I0319 12:17:15.396449 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:15.524569 master-0 kubenswrapper[17644]: I0319 12:17:15.524524 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:15.565074 master-0 kubenswrapper[17644]: I0319 12:17:15.565025 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-tcwpc" Mar 19 12:17:17.914995 master-0 kubenswrapper[17644]: I0319 
12:17:17.914917 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-pbgxp" Mar 19 12:17:20.509383 master-0 kubenswrapper[17644]: I0319 12:17:20.509298 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rj2bt" Mar 19 12:17:20.645488 master-0 kubenswrapper[17644]: I0319 12:17:20.645422 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-46b4x" Mar 19 12:17:23.477770 master-0 kubenswrapper[17644]: I0319 12:17:23.477680 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-94bfp" Mar 19 12:17:28.301717 master-0 kubenswrapper[17644]: I0319 12:17:28.301596 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-7fdjj"] Mar 19 12:17:28.302721 master-0 kubenswrapper[17644]: I0319 12:17:28.302698 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.308237 master-0 kubenswrapper[17644]: I0319 12:17:28.308196 17644 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert"
Mar 19 12:17:28.320407 master-0 kubenswrapper[17644]: I0319 12:17:28.311639 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-7fdjj"]
Mar 19 12:17:28.472117 master-0 kubenswrapper[17644]: I0319 12:17:28.472061 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-lvmd-config\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.472407 master-0 kubenswrapper[17644]: I0319 12:17:28.472388 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-run-udev\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.472530 master-0 kubenswrapper[17644]: I0319 12:17:28.472516 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-node-plugin-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.472622 master-0 kubenswrapper[17644]: I0319 12:17:28.472610 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-sys\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.472776 master-0 kubenswrapper[17644]: I0319 12:17:28.472754 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-file-lock-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.472896 master-0 kubenswrapper[17644]: I0319 12:17:28.472879 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/353e7c87-ddee-472d-8a41-a4fc62ded137-metrics-cert\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.472992 master-0 kubenswrapper[17644]: I0319 12:17:28.472976 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzrv\" (UniqueName: \"kubernetes.io/projected/353e7c87-ddee-472d-8a41-a4fc62ded137-kube-api-access-fwzrv\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.473069 master-0 kubenswrapper[17644]: I0319 12:17:28.473057 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-csi-plugin-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.473162 master-0 kubenswrapper[17644]: I0319 12:17:28.473148 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-device-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.473252 master-0 kubenswrapper[17644]: I0319 12:17:28.473237 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-registration-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.473387 master-0 kubenswrapper[17644]: I0319 12:17:28.473345 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-pod-volumes-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.574690 master-0 kubenswrapper[17644]: I0319 12:17:28.574501 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-sys\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.574690 master-0 kubenswrapper[17644]: I0319 12:17:28.574623 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-file-lock-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575027 master-0 kubenswrapper[17644]: I0319 12:17:28.574689 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-sys\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575027 master-0 kubenswrapper[17644]: I0319 12:17:28.574821 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/353e7c87-ddee-472d-8a41-a4fc62ded137-metrics-cert\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575027 master-0 kubenswrapper[17644]: I0319 12:17:28.574867 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzrv\" (UniqueName: \"kubernetes.io/projected/353e7c87-ddee-472d-8a41-a4fc62ded137-kube-api-access-fwzrv\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575027 master-0 kubenswrapper[17644]: I0319 12:17:28.574910 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-csi-plugin-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575447 master-0 kubenswrapper[17644]: I0319 12:17:28.575047 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-device-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575447 master-0 kubenswrapper[17644]: I0319 12:17:28.575105 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-registration-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575447 master-0 kubenswrapper[17644]: I0319 12:17:28.575189 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-device-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575447 master-0 kubenswrapper[17644]: I0319 12:17:28.575200 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-file-lock-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575447 master-0 kubenswrapper[17644]: I0319 12:17:28.575389 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-csi-plugin-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575447 master-0 kubenswrapper[17644]: I0319 12:17:28.575429 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-pod-volumes-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575447 master-0 kubenswrapper[17644]: I0319 12:17:28.575450 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-lvmd-config\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575904 master-0 kubenswrapper[17644]: I0319 12:17:28.575491 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-run-udev\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575904 master-0 kubenswrapper[17644]: I0319 12:17:28.575535 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-node-plugin-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575904 master-0 kubenswrapper[17644]: I0319 12:17:28.575680 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-node-plugin-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575904 master-0 kubenswrapper[17644]: I0319 12:17:28.575718 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-registration-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.575904 master-0 kubenswrapper[17644]: I0319 12:17:28.575795 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-pod-volumes-dir\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.576082 master-0 kubenswrapper[17644]: I0319 12:17:28.575990 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-lvmd-config\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.576288 master-0 kubenswrapper[17644]: I0319 12:17:28.576251 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/353e7c87-ddee-472d-8a41-a4fc62ded137-run-udev\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.578839 master-0 kubenswrapper[17644]: I0319 12:17:28.578621 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/353e7c87-ddee-472d-8a41-a4fc62ded137-metrics-cert\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.594479 master-0 kubenswrapper[17644]: I0319 12:17:28.594429 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzrv\" (UniqueName: \"kubernetes.io/projected/353e7c87-ddee-472d-8a41-a4fc62ded137-kube-api-access-fwzrv\") pod \"vg-manager-7fdjj\" (UID: \"353e7c87-ddee-472d-8a41-a4fc62ded137\") " pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:28.633052 master-0 kubenswrapper[17644]: I0319 12:17:28.632999 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:29.114694 master-0 kubenswrapper[17644]: I0319 12:17:29.114633 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-7fdjj"]
Mar 19 12:17:29.116144 master-0 kubenswrapper[17644]: W0319 12:17:29.116088 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod353e7c87_ddee_472d_8a41_a4fc62ded137.slice/crio-b2453408c0feb276d2f02a5114b57640958edc59c09060f6b723d5983c23aa4c WatchSource:0}: Error finding container b2453408c0feb276d2f02a5114b57640958edc59c09060f6b723d5983c23aa4c: Status 404 returned error can't find the container with id b2453408c0feb276d2f02a5114b57640958edc59c09060f6b723d5983c23aa4c
Mar 19 12:17:29.515628 master-0 kubenswrapper[17644]: I0319 12:17:29.515568 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-7fdjj" event={"ID":"353e7c87-ddee-472d-8a41-a4fc62ded137","Type":"ContainerStarted","Data":"52685d1bbb62f11f046d062ee74b574e0e234c1f33933c340c62dfb552b1ec45"}
Mar 19 12:17:29.515628 master-0 kubenswrapper[17644]: I0319 12:17:29.515627 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-7fdjj" event={"ID":"353e7c87-ddee-472d-8a41-a4fc62ded137","Type":"ContainerStarted","Data":"b2453408c0feb276d2f02a5114b57640958edc59c09060f6b723d5983c23aa4c"}
Mar 19 12:17:29.542047 master-0 kubenswrapper[17644]: I0319 12:17:29.541935 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-7fdjj" podStartSLOduration=1.5419190170000001 podStartE2EDuration="1.541919017s" podCreationTimestamp="2026-03-19 12:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:17:29.538433131 +0000 UTC m=+1083.308391186" watchObservedRunningTime="2026-03-19 12:17:29.541919017 +0000 UTC m=+1083.311877052"
Mar 19 12:17:30.532781 master-0 kubenswrapper[17644]: I0319 12:17:30.532601 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-tcwpc"
Mar 19 12:17:31.535793 master-0 kubenswrapper[17644]: I0319 12:17:31.534864 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-7fdjj_353e7c87-ddee-472d-8a41-a4fc62ded137/vg-manager/0.log"
Mar 19 12:17:31.535793 master-0 kubenswrapper[17644]: I0319 12:17:31.535005 17644 generic.go:334] "Generic (PLEG): container finished" podID="353e7c87-ddee-472d-8a41-a4fc62ded137" containerID="52685d1bbb62f11f046d062ee74b574e0e234c1f33933c340c62dfb552b1ec45" exitCode=1
Mar 19 12:17:31.535793 master-0 kubenswrapper[17644]: I0319 12:17:31.535050 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-7fdjj" event={"ID":"353e7c87-ddee-472d-8a41-a4fc62ded137","Type":"ContainerDied","Data":"52685d1bbb62f11f046d062ee74b574e0e234c1f33933c340c62dfb552b1ec45"}
Mar 19 12:17:31.536440 master-0 kubenswrapper[17644]: I0319 12:17:31.535892 17644 scope.go:117] "RemoveContainer" containerID="52685d1bbb62f11f046d062ee74b574e0e234c1f33933c340c62dfb552b1ec45"
Mar 19 12:17:31.854665 master-0 kubenswrapper[17644]: I0319 12:17:31.854548 17644 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock"
Mar 19 12:17:31.932018 master-0 kubenswrapper[17644]: I0319 12:17:31.931838 17644 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-19T12:17:31.85460265Z","Handler":null,"Name":""}
Mar 19 12:17:31.934362 master-0 kubenswrapper[17644]: I0319 12:17:31.934319 17644 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0
Mar 19 12:17:31.934463 master-0 kubenswrapper[17644]: I0319 12:17:31.934365 17644 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock
Mar 19 12:17:32.547681 master-0 kubenswrapper[17644]: I0319 12:17:32.547619 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-7fdjj_353e7c87-ddee-472d-8a41-a4fc62ded137/vg-manager/0.log"
Mar 19 12:17:32.547681 master-0 kubenswrapper[17644]: I0319 12:17:32.547691 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-7fdjj" event={"ID":"353e7c87-ddee-472d-8a41-a4fc62ded137","Type":"ContainerStarted","Data":"ace744f26fae974ad61a00dbb0810d1e7f5e1fe6dcb7d6449aa524f975fd2ffe"}
Mar 19 12:17:34.925811 master-0 kubenswrapper[17644]: I0319 12:17:34.925749 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-h48q4"]
Mar 19 12:17:34.927601 master-0 kubenswrapper[17644]: I0319 12:17:34.927007 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h48q4"
Mar 19 12:17:34.931756 master-0 kubenswrapper[17644]: I0319 12:17:34.929204 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 19 12:17:34.933261 master-0 kubenswrapper[17644]: I0319 12:17:34.932633 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 19 12:17:34.966404 master-0 kubenswrapper[17644]: I0319 12:17:34.966345 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h48q4"]
Mar 19 12:17:35.009966 master-0 kubenswrapper[17644]: I0319 12:17:35.009893 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvw28\" (UniqueName: \"kubernetes.io/projected/7a82b912-f1bc-4c67-8a31-75a9b3cdb00e-kube-api-access-fvw28\") pod \"openstack-operator-index-h48q4\" (UID: \"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e\") " pod="openstack-operators/openstack-operator-index-h48q4"
Mar 19 12:17:35.111801 master-0 kubenswrapper[17644]: I0319 12:17:35.111711 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvw28\" (UniqueName: \"kubernetes.io/projected/7a82b912-f1bc-4c67-8a31-75a9b3cdb00e-kube-api-access-fvw28\") pod \"openstack-operator-index-h48q4\" (UID: \"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e\") " pod="openstack-operators/openstack-operator-index-h48q4"
Mar 19 12:17:35.127841 master-0 kubenswrapper[17644]: I0319 12:17:35.127799 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvw28\" (UniqueName: \"kubernetes.io/projected/7a82b912-f1bc-4c67-8a31-75a9b3cdb00e-kube-api-access-fvw28\") pod \"openstack-operator-index-h48q4\" (UID: \"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e\") " pod="openstack-operators/openstack-operator-index-h48q4"
Mar 19 12:17:35.256497 master-0 kubenswrapper[17644]: I0319 12:17:35.256442 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h48q4"
Mar 19 12:17:35.726907 master-0 kubenswrapper[17644]: I0319 12:17:35.718629 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h48q4"]
Mar 19 12:17:36.583056 master-0 kubenswrapper[17644]: I0319 12:17:36.582991 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h48q4" event={"ID":"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e","Type":"ContainerStarted","Data":"90ed9d1d115e80bc07441579c0c6f7b208229ee32202b9a5a68a70314d15e94d"}
Mar 19 12:17:37.596763 master-0 kubenswrapper[17644]: I0319 12:17:37.596656 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h48q4" event={"ID":"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e","Type":"ContainerStarted","Data":"9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3"}
Mar 19 12:17:37.666678 master-0 kubenswrapper[17644]: I0319 12:17:37.666551 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-h48q4" podStartSLOduration=2.892675232 podStartE2EDuration="3.666523272s" podCreationTimestamp="2026-03-19 12:17:34 +0000 UTC" firstStartedPulling="2026-03-19 12:17:35.742857079 +0000 UTC m=+1089.512815114" lastFinishedPulling="2026-03-19 12:17:36.516705109 +0000 UTC m=+1090.286663154" observedRunningTime="2026-03-19 12:17:37.659648195 +0000 UTC m=+1091.429606240" watchObservedRunningTime="2026-03-19 12:17:37.666523272 +0000 UTC m=+1091.436481307"
Mar 19 12:17:38.634974 master-0 kubenswrapper[17644]: I0319 12:17:38.634904 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:38.640186 master-0 kubenswrapper[17644]: I0319 12:17:38.640150 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:39.072749 master-0 kubenswrapper[17644]: I0319 12:17:39.072640 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-h48q4"]
Mar 19 12:17:39.569786 master-0 kubenswrapper[17644]: I0319 12:17:39.569686 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-69c7fd464c-4x4r7" podUID="ba2a45c7-d196-489e-992c-ed8553206ced" containerName="console" containerID="cri-o://cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb" gracePeriod=15
Mar 19 12:17:39.613401 master-0 kubenswrapper[17644]: I0319 12:17:39.613350 17644 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-h48q4" podUID="7a82b912-f1bc-4c67-8a31-75a9b3cdb00e" containerName="registry-server" containerID="cri-o://9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3" gracePeriod=2
Mar 19 12:17:39.614204 master-0 kubenswrapper[17644]: I0319 12:17:39.613885 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:39.615944 master-0 kubenswrapper[17644]: I0319 12:17:39.615916 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-7fdjj"
Mar 19 12:17:39.678633 master-0 kubenswrapper[17644]: I0319 12:17:39.678565 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-76j62"]
Mar 19 12:17:39.680537 master-0 kubenswrapper[17644]: I0319 12:17:39.680492 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-76j62"
Mar 19 12:17:39.693855 master-0 kubenswrapper[17644]: I0319 12:17:39.693807 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-76j62"]
Mar 19 12:17:39.749143 master-0 kubenswrapper[17644]: I0319 12:17:39.749033 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkxpp\" (UniqueName: \"kubernetes.io/projected/aff4c24a-63b7-44ea-86de-c543b1afd15f-kube-api-access-bkxpp\") pod \"openstack-operator-index-76j62\" (UID: \"aff4c24a-63b7-44ea-86de-c543b1afd15f\") " pod="openstack-operators/openstack-operator-index-76j62"
Mar 19 12:17:39.868185 master-0 kubenswrapper[17644]: I0319 12:17:39.866560 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkxpp\" (UniqueName: \"kubernetes.io/projected/aff4c24a-63b7-44ea-86de-c543b1afd15f-kube-api-access-bkxpp\") pod \"openstack-operator-index-76j62\" (UID: \"aff4c24a-63b7-44ea-86de-c543b1afd15f\") " pod="openstack-operators/openstack-operator-index-76j62"
Mar 19 12:17:39.885825 master-0 kubenswrapper[17644]: I0319 12:17:39.885383 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkxpp\" (UniqueName: \"kubernetes.io/projected/aff4c24a-63b7-44ea-86de-c543b1afd15f-kube-api-access-bkxpp\") pod \"openstack-operator-index-76j62\" (UID: \"aff4c24a-63b7-44ea-86de-c543b1afd15f\") " pod="openstack-operators/openstack-operator-index-76j62"
Mar 19 12:17:40.170033 master-0 kubenswrapper[17644]: I0319 12:17:40.169999 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-76j62"
Mar 19 12:17:40.178009 master-0 kubenswrapper[17644]: I0319 12:17:40.176625 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69c7fd464c-4x4r7_ba2a45c7-d196-489e-992c-ed8553206ced/console/0.log"
Mar 19 12:17:40.178009 master-0 kubenswrapper[17644]: I0319 12:17:40.176686 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69c7fd464c-4x4r7"
Mar 19 12:17:40.200510 master-0 kubenswrapper[17644]: I0319 12:17:40.198510 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h48q4"
Mar 19 12:17:40.271525 master-0 kubenswrapper[17644]: I0319 12:17:40.271464 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-service-ca\") pod \"ba2a45c7-d196-489e-992c-ed8553206ced\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") "
Mar 19 12:17:40.271806 master-0 kubenswrapper[17644]: I0319 12:17:40.271557 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-oauth-config\") pod \"ba2a45c7-d196-489e-992c-ed8553206ced\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") "
Mar 19 12:17:40.271806 master-0 kubenswrapper[17644]: I0319 12:17:40.271600 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvw28\" (UniqueName: \"kubernetes.io/projected/7a82b912-f1bc-4c67-8a31-75a9b3cdb00e-kube-api-access-fvw28\") pod \"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e\" (UID: \"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e\") "
Mar 19 12:17:40.271806 master-0 kubenswrapper[17644]: I0319 12:17:40.271650 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-console-config\") pod \"ba2a45c7-d196-489e-992c-ed8553206ced\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") "
Mar 19 12:17:40.271806 master-0 kubenswrapper[17644]: I0319 12:17:40.271684 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-oauth-serving-cert\") pod \"ba2a45c7-d196-489e-992c-ed8553206ced\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") "
Mar 19 12:17:40.271806 master-0 kubenswrapper[17644]: I0319 12:17:40.271783 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-serving-cert\") pod \"ba2a45c7-d196-489e-992c-ed8553206ced\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") "
Mar 19 12:17:40.272059 master-0 kubenswrapper[17644]: I0319 12:17:40.271824 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt4b7\" (UniqueName: \"kubernetes.io/projected/ba2a45c7-d196-489e-992c-ed8553206ced-kube-api-access-mt4b7\") pod \"ba2a45c7-d196-489e-992c-ed8553206ced\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") "
Mar 19 12:17:40.272059 master-0 kubenswrapper[17644]: I0319 12:17:40.271861 17644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-trusted-ca-bundle\") pod \"ba2a45c7-d196-489e-992c-ed8553206ced\" (UID: \"ba2a45c7-d196-489e-992c-ed8553206ced\") "
Mar 19 12:17:40.273110 master-0 kubenswrapper[17644]: I0319 12:17:40.272940 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ba2a45c7-d196-489e-992c-ed8553206ced" (UID: "ba2a45c7-d196-489e-992c-ed8553206ced"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:17:40.273110 master-0 kubenswrapper[17644]: I0319 12:17:40.273098 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-console-config" (OuterVolumeSpecName: "console-config") pod "ba2a45c7-d196-489e-992c-ed8553206ced" (UID: "ba2a45c7-d196-489e-992c-ed8553206ced"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:17:40.273370 master-0 kubenswrapper[17644]: I0319 12:17:40.273108 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ba2a45c7-d196-489e-992c-ed8553206ced" (UID: "ba2a45c7-d196-489e-992c-ed8553206ced"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:17:40.274072 master-0 kubenswrapper[17644]: I0319 12:17:40.274033 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-service-ca" (OuterVolumeSpecName: "service-ca") pod "ba2a45c7-d196-489e-992c-ed8553206ced" (UID: "ba2a45c7-d196-489e-992c-ed8553206ced"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:17:40.276146 master-0 kubenswrapper[17644]: I0319 12:17:40.275918 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2a45c7-d196-489e-992c-ed8553206ced-kube-api-access-mt4b7" (OuterVolumeSpecName: "kube-api-access-mt4b7") pod "ba2a45c7-d196-489e-992c-ed8553206ced" (UID: "ba2a45c7-d196-489e-992c-ed8553206ced"). InnerVolumeSpecName "kube-api-access-mt4b7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:17:40.276452 master-0 kubenswrapper[17644]: I0319 12:17:40.276412 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ba2a45c7-d196-489e-992c-ed8553206ced" (UID: "ba2a45c7-d196-489e-992c-ed8553206ced"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:17:40.276736 master-0 kubenswrapper[17644]: I0319 12:17:40.276657 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a82b912-f1bc-4c67-8a31-75a9b3cdb00e-kube-api-access-fvw28" (OuterVolumeSpecName: "kube-api-access-fvw28") pod "7a82b912-f1bc-4c67-8a31-75a9b3cdb00e" (UID: "7a82b912-f1bc-4c67-8a31-75a9b3cdb00e"). InnerVolumeSpecName "kube-api-access-fvw28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:17:40.277313 master-0 kubenswrapper[17644]: I0319 12:17:40.277271 17644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ba2a45c7-d196-489e-992c-ed8553206ced" (UID: "ba2a45c7-d196-489e-992c-ed8553206ced"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:17:40.376844 master-0 kubenswrapper[17644]: I0319 12:17:40.374002 17644 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:17:40.376844 master-0 kubenswrapper[17644]: I0319 12:17:40.374054 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvw28\" (UniqueName: \"kubernetes.io/projected/7a82b912-f1bc-4c67-8a31-75a9b3cdb00e-kube-api-access-fvw28\") on node \"master-0\" DevicePath \"\""
Mar 19 12:17:40.376844 master-0 kubenswrapper[17644]: I0319 12:17:40.374071 17644 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-console-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:17:40.376844 master-0 kubenswrapper[17644]: I0319 12:17:40.374081 17644 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 12:17:40.376844 master-0 kubenswrapper[17644]: I0319 12:17:40.374091 17644 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba2a45c7-d196-489e-992c-ed8553206ced-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 12:17:40.376844 master-0 kubenswrapper[17644]: I0319 12:17:40.374101 17644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt4b7\" (UniqueName: \"kubernetes.io/projected/ba2a45c7-d196-489e-992c-ed8553206ced-kube-api-access-mt4b7\") on node \"master-0\" DevicePath \"\""
Mar 19 12:17:40.376844 master-0 kubenswrapper[17644]: I0319 12:17:40.374110 17644 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:17:40.376844 master-0 kubenswrapper[17644]: I0319 12:17:40.374120 17644 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba2a45c7-d196-489e-992c-ed8553206ced-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 12:17:40.628110 master-0 kubenswrapper[17644]: I0319 12:17:40.628062 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69c7fd464c-4x4r7_ba2a45c7-d196-489e-992c-ed8553206ced/console/0.log"
Mar 19 12:17:40.628317 master-0 kubenswrapper[17644]: I0319 12:17:40.628115 17644 generic.go:334] "Generic (PLEG): container finished" podID="ba2a45c7-d196-489e-992c-ed8553206ced" containerID="cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb" exitCode=2
Mar 19 12:17:40.628317 master-0 kubenswrapper[17644]: I0319 12:17:40.628166 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c7fd464c-4x4r7" event={"ID":"ba2a45c7-d196-489e-992c-ed8553206ced","Type":"ContainerDied","Data":"cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb"}
Mar 19 12:17:40.628317 master-0 kubenswrapper[17644]: I0319 12:17:40.628195 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c7fd464c-4x4r7" event={"ID":"ba2a45c7-d196-489e-992c-ed8553206ced","Type":"ContainerDied","Data":"654dda2a3d2db2d059c73bd1494c9dce4722f56cdd611e08a15a178bda097a1b"}
Mar 19 12:17:40.628317 master-0 kubenswrapper[17644]: I0319 12:17:40.628213 17644 scope.go:117] "RemoveContainer" containerID="cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb"
Mar 19 12:17:40.628317 master-0 kubenswrapper[17644]: I0319 12:17:40.628304 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69c7fd464c-4x4r7"
Mar 19 12:17:40.632339 master-0 kubenswrapper[17644]: I0319 12:17:40.632303 17644 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h48q4"
Mar 19 12:17:40.633663 master-0 kubenswrapper[17644]: I0319 12:17:40.633628 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h48q4" event={"ID":"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e","Type":"ContainerDied","Data":"9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3"}
Mar 19 12:17:40.636795 master-0 kubenswrapper[17644]: I0319 12:17:40.636137 17644 generic.go:334] "Generic (PLEG): container finished" podID="7a82b912-f1bc-4c67-8a31-75a9b3cdb00e" containerID="9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3" exitCode=0
Mar 19 12:17:40.636795 master-0 kubenswrapper[17644]: I0319 12:17:40.636547 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h48q4" event={"ID":"7a82b912-f1bc-4c67-8a31-75a9b3cdb00e","Type":"ContainerDied","Data":"90ed9d1d115e80bc07441579c0c6f7b208229ee32202b9a5a68a70314d15e94d"}
Mar 19 12:17:40.637760 master-0 kubenswrapper[17644]: I0319 12:17:40.637694 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-76j62"]
Mar 19 12:17:40.692257 master-0 kubenswrapper[17644]: I0319 12:17:40.692222 17644 scope.go:117] "RemoveContainer" containerID="cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb"
Mar 19 12:17:40.693009 master-0 kubenswrapper[17644]: E0319 12:17:40.692981 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb\": container with ID starting with cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb not found:
ID does not exist" containerID="cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb" Mar 19 12:17:40.693074 master-0 kubenswrapper[17644]: I0319 12:17:40.693023 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb"} err="failed to get container status \"cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb\": rpc error: code = NotFound desc = could not find container \"cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb\": container with ID starting with cf0853f83eda5d377b3accb257e570154d9a2bfd451ff4a0d2e0c016ec32b8bb not found: ID does not exist" Mar 19 12:17:40.693118 master-0 kubenswrapper[17644]: I0319 12:17:40.693076 17644 scope.go:117] "RemoveContainer" containerID="9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3" Mar 19 12:17:40.693443 master-0 kubenswrapper[17644]: I0319 12:17:40.693413 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69c7fd464c-4x4r7"] Mar 19 12:17:40.700516 master-0 kubenswrapper[17644]: I0319 12:17:40.700442 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69c7fd464c-4x4r7"] Mar 19 12:17:40.722440 master-0 kubenswrapper[17644]: I0319 12:17:40.720966 17644 scope.go:117] "RemoveContainer" containerID="9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3" Mar 19 12:17:40.727275 master-0 kubenswrapper[17644]: I0319 12:17:40.727183 17644 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-h48q4"] Mar 19 12:17:40.727606 master-0 kubenswrapper[17644]: E0319 12:17:40.727452 17644 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3\": container with ID starting with 
9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3 not found: ID does not exist" containerID="9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3" Mar 19 12:17:40.727606 master-0 kubenswrapper[17644]: I0319 12:17:40.727497 17644 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3"} err="failed to get container status \"9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3\": rpc error: code = NotFound desc = could not find container \"9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3\": container with ID starting with 9021306bdba00a246d0eaa57a1593592d9bb7b154807af7c6d5663a0c9f0efb3 not found: ID does not exist" Mar 19 12:17:40.740668 master-0 kubenswrapper[17644]: I0319 12:17:40.740594 17644 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-h48q4"] Mar 19 12:17:41.647368 master-0 kubenswrapper[17644]: I0319 12:17:41.647292 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-76j62" event={"ID":"aff4c24a-63b7-44ea-86de-c543b1afd15f","Type":"ContainerStarted","Data":"3533483017d3f5e6a93e751aa92ee5494d4ac39948806b53c7caaacecd73cf52"} Mar 19 12:17:41.647368 master-0 kubenswrapper[17644]: I0319 12:17:41.647355 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-76j62" event={"ID":"aff4c24a-63b7-44ea-86de-c543b1afd15f","Type":"ContainerStarted","Data":"86b8b4d8a81af2b1da123d049e857c591f0d1d84705e54f403296f4096873664"} Mar 19 12:17:41.677559 master-0 kubenswrapper[17644]: I0319 12:17:41.677397 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-76j62" podStartSLOduration=2.174693108 podStartE2EDuration="2.677372778s" podCreationTimestamp="2026-03-19 12:17:39 +0000 UTC" 
firstStartedPulling="2026-03-19 12:17:40.642713148 +0000 UTC m=+1094.412671183" lastFinishedPulling="2026-03-19 12:17:41.145392808 +0000 UTC m=+1094.915350853" observedRunningTime="2026-03-19 12:17:41.672328985 +0000 UTC m=+1095.442287030" watchObservedRunningTime="2026-03-19 12:17:41.677372778 +0000 UTC m=+1095.447330833" Mar 19 12:17:42.494113 master-0 kubenswrapper[17644]: I0319 12:17:42.494063 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a82b912-f1bc-4c67-8a31-75a9b3cdb00e" path="/var/lib/kubelet/pods/7a82b912-f1bc-4c67-8a31-75a9b3cdb00e/volumes" Mar 19 12:17:42.496710 master-0 kubenswrapper[17644]: I0319 12:17:42.496663 17644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2a45c7-d196-489e-992c-ed8553206ced" path="/var/lib/kubelet/pods/ba2a45c7-d196-489e-992c-ed8553206ced/volumes" Mar 19 12:17:50.171189 master-0 kubenswrapper[17644]: I0319 12:17:50.171091 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-76j62" Mar 19 12:17:50.171189 master-0 kubenswrapper[17644]: I0319 12:17:50.171173 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-76j62" Mar 19 12:17:50.199373 master-0 kubenswrapper[17644]: I0319 12:17:50.199275 17644 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-76j62" Mar 19 12:17:50.760683 master-0 kubenswrapper[17644]: I0319 12:17:50.760635 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-76j62" Mar 19 12:22:46.467837 master-0 kubenswrapper[17644]: I0319 12:22:46.461209 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-84vlf/must-gather-r5msw"] Mar 19 12:22:46.468658 master-0 kubenswrapper[17644]: E0319 12:22:46.468345 17644 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7a82b912-f1bc-4c67-8a31-75a9b3cdb00e" containerName="registry-server" Mar 19 12:22:46.468658 master-0 kubenswrapper[17644]: I0319 12:22:46.468381 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a82b912-f1bc-4c67-8a31-75a9b3cdb00e" containerName="registry-server" Mar 19 12:22:46.468658 master-0 kubenswrapper[17644]: E0319 12:22:46.468407 17644 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2a45c7-d196-489e-992c-ed8553206ced" containerName="console" Mar 19 12:22:46.468658 master-0 kubenswrapper[17644]: I0319 12:22:46.468414 17644 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2a45c7-d196-489e-992c-ed8553206ced" containerName="console" Mar 19 12:22:46.468658 master-0 kubenswrapper[17644]: I0319 12:22:46.468598 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a82b912-f1bc-4c67-8a31-75a9b3cdb00e" containerName="registry-server" Mar 19 12:22:46.468658 master-0 kubenswrapper[17644]: I0319 12:22:46.468638 17644 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2a45c7-d196-489e-992c-ed8553206ced" containerName="console" Mar 19 12:22:46.471809 master-0 kubenswrapper[17644]: I0319 12:22:46.469780 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-84vlf/must-gather-r5msw" Mar 19 12:22:46.471809 master-0 kubenswrapper[17644]: I0319 12:22:46.469901 17644 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-84vlf/must-gather-ng7g8"] Mar 19 12:22:46.474540 master-0 kubenswrapper[17644]: I0319 12:22:46.474438 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-84vlf/must-gather-ng7g8" Mar 19 12:22:46.477113 master-0 kubenswrapper[17644]: I0319 12:22:46.477064 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-84vlf"/"openshift-service-ca.crt" Mar 19 12:22:46.477511 master-0 kubenswrapper[17644]: I0319 12:22:46.477449 17644 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-84vlf"/"kube-root-ca.crt" Mar 19 12:22:46.480961 master-0 kubenswrapper[17644]: I0319 12:22:46.480895 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-84vlf/must-gather-r5msw"] Mar 19 12:22:46.522386 master-0 kubenswrapper[17644]: I0319 12:22:46.520170 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-84vlf/must-gather-ng7g8"] Mar 19 12:22:46.527457 master-0 kubenswrapper[17644]: I0319 12:22:46.527069 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfcbv\" (UniqueName: \"kubernetes.io/projected/18b48b71-6c25-4744-b9dc-cea2d319efc5-kube-api-access-kfcbv\") pod \"must-gather-r5msw\" (UID: \"18b48b71-6c25-4744-b9dc-cea2d319efc5\") " pod="openshift-must-gather-84vlf/must-gather-r5msw" Mar 19 12:22:46.527457 master-0 kubenswrapper[17644]: I0319 12:22:46.527132 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4x4g\" (UniqueName: \"kubernetes.io/projected/e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6-kube-api-access-f4x4g\") pod \"must-gather-ng7g8\" (UID: \"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6\") " pod="openshift-must-gather-84vlf/must-gather-ng7g8" Mar 19 12:22:46.527457 master-0 kubenswrapper[17644]: I0319 12:22:46.527202 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/18b48b71-6c25-4744-b9dc-cea2d319efc5-must-gather-output\") pod \"must-gather-r5msw\" (UID: \"18b48b71-6c25-4744-b9dc-cea2d319efc5\") " pod="openshift-must-gather-84vlf/must-gather-r5msw" Mar 19 12:22:46.527457 master-0 kubenswrapper[17644]: I0319 12:22:46.527243 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6-must-gather-output\") pod \"must-gather-ng7g8\" (UID: \"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6\") " pod="openshift-must-gather-84vlf/must-gather-ng7g8" Mar 19 12:22:46.628914 master-0 kubenswrapper[17644]: I0319 12:22:46.628872 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/18b48b71-6c25-4744-b9dc-cea2d319efc5-must-gather-output\") pod \"must-gather-r5msw\" (UID: \"18b48b71-6c25-4744-b9dc-cea2d319efc5\") " pod="openshift-must-gather-84vlf/must-gather-r5msw" Mar 19 12:22:46.629236 master-0 kubenswrapper[17644]: I0319 12:22:46.629218 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6-must-gather-output\") pod \"must-gather-ng7g8\" (UID: \"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6\") " pod="openshift-must-gather-84vlf/must-gather-ng7g8" Mar 19 12:22:46.629408 master-0 kubenswrapper[17644]: I0319 12:22:46.629391 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfcbv\" (UniqueName: \"kubernetes.io/projected/18b48b71-6c25-4744-b9dc-cea2d319efc5-kube-api-access-kfcbv\") pod \"must-gather-r5msw\" (UID: \"18b48b71-6c25-4744-b9dc-cea2d319efc5\") " pod="openshift-must-gather-84vlf/must-gather-r5msw" Mar 19 12:22:46.629634 master-0 kubenswrapper[17644]: I0319 12:22:46.629593 17644 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/18b48b71-6c25-4744-b9dc-cea2d319efc5-must-gather-output\") pod \"must-gather-r5msw\" (UID: \"18b48b71-6c25-4744-b9dc-cea2d319efc5\") " pod="openshift-must-gather-84vlf/must-gather-r5msw" Mar 19 12:22:46.629763 master-0 kubenswrapper[17644]: I0319 12:22:46.629744 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4x4g\" (UniqueName: \"kubernetes.io/projected/e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6-kube-api-access-f4x4g\") pod \"must-gather-ng7g8\" (UID: \"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6\") " pod="openshift-must-gather-84vlf/must-gather-ng7g8" Mar 19 12:22:46.630197 master-0 kubenswrapper[17644]: I0319 12:22:46.630143 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6-must-gather-output\") pod \"must-gather-ng7g8\" (UID: \"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6\") " pod="openshift-must-gather-84vlf/must-gather-ng7g8" Mar 19 12:22:46.648337 master-0 kubenswrapper[17644]: I0319 12:22:46.647214 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4x4g\" (UniqueName: \"kubernetes.io/projected/e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6-kube-api-access-f4x4g\") pod \"must-gather-ng7g8\" (UID: \"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6\") " pod="openshift-must-gather-84vlf/must-gather-ng7g8" Mar 19 12:22:46.649443 master-0 kubenswrapper[17644]: I0319 12:22:46.649401 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfcbv\" (UniqueName: \"kubernetes.io/projected/18b48b71-6c25-4744-b9dc-cea2d319efc5-kube-api-access-kfcbv\") pod \"must-gather-r5msw\" (UID: \"18b48b71-6c25-4744-b9dc-cea2d319efc5\") " pod="openshift-must-gather-84vlf/must-gather-r5msw" Mar 19 12:22:46.794093 master-0 kubenswrapper[17644]: I0319 12:22:46.793891 17644 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-84vlf/must-gather-r5msw" Mar 19 12:22:46.832593 master-0 kubenswrapper[17644]: I0319 12:22:46.832513 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-84vlf/must-gather-ng7g8" Mar 19 12:22:47.301940 master-0 kubenswrapper[17644]: I0319 12:22:47.301858 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-84vlf/must-gather-r5msw"] Mar 19 12:22:47.305046 master-0 kubenswrapper[17644]: I0319 12:22:47.305014 17644 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 12:22:47.372036 master-0 kubenswrapper[17644]: W0319 12:22:47.371951 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5e2abef_cfe9_4660_b4ca_ad9eb94ed1e6.slice/crio-6536e637501f0b26124be34a1fc187bd9cf4a2a93bdcaec68aa275630476303f WatchSource:0}: Error finding container 6536e637501f0b26124be34a1fc187bd9cf4a2a93bdcaec68aa275630476303f: Status 404 returned error can't find the container with id 6536e637501f0b26124be34a1fc187bd9cf4a2a93bdcaec68aa275630476303f Mar 19 12:22:47.376529 master-0 kubenswrapper[17644]: I0319 12:22:47.376437 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-84vlf/must-gather-ng7g8"] Mar 19 12:22:47.438317 master-0 kubenswrapper[17644]: I0319 12:22:47.438263 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-84vlf/must-gather-ng7g8" event={"ID":"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6","Type":"ContainerStarted","Data":"6536e637501f0b26124be34a1fc187bd9cf4a2a93bdcaec68aa275630476303f"} Mar 19 12:22:47.439805 master-0 kubenswrapper[17644]: I0319 12:22:47.439770 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-84vlf/must-gather-r5msw" 
event={"ID":"18b48b71-6c25-4744-b9dc-cea2d319efc5","Type":"ContainerStarted","Data":"ffda57616d083b613908f6ff25b7094874dff208d1778b0705fbc164c1eb4a90"} Mar 19 12:22:49.466718 master-0 kubenswrapper[17644]: I0319 12:22:49.466639 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-84vlf/must-gather-ng7g8" event={"ID":"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6","Type":"ContainerStarted","Data":"4b15e64cd0f5958177934769720d80ff4e60742ff32816d6eeaa22b6ffde6db7"} Mar 19 12:22:49.466718 master-0 kubenswrapper[17644]: I0319 12:22:49.466703 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-84vlf/must-gather-ng7g8" event={"ID":"e5e2abef-cfe9-4660-b4ca-ad9eb94ed1e6","Type":"ContainerStarted","Data":"2363abd6a6c290b49449aee2bac265f2a944a7c4fd90cba15fc15020129706ad"} Mar 19 12:22:49.496277 master-0 kubenswrapper[17644]: I0319 12:22:49.496168 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-84vlf/must-gather-ng7g8" podStartSLOduration=2.408991782 podStartE2EDuration="3.496090835s" podCreationTimestamp="2026-03-19 12:22:46 +0000 UTC" firstStartedPulling="2026-03-19 12:22:47.374620528 +0000 UTC m=+1401.144578573" lastFinishedPulling="2026-03-19 12:22:48.461719591 +0000 UTC m=+1402.231677626" observedRunningTime="2026-03-19 12:22:49.488231211 +0000 UTC m=+1403.258189256" watchObservedRunningTime="2026-03-19 12:22:49.496090835 +0000 UTC m=+1403.266048870" Mar 19 12:22:52.763804 master-0 kubenswrapper[17644]: I0319 12:22:52.763672 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7d58488df-rcbf8_dd6ec279-d92f-45c2-97c2-88b96fbd6600/cluster-version-operator/0.log" Mar 19 12:22:53.887768 master-0 kubenswrapper[17644]: I0319 12:22:53.887663 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-46b4x_304c0dcc-9c40-4bf4-9c05-9d1a4601b15c/controller/0.log" Mar 19 
12:22:53.895254 master-0 kubenswrapper[17644]: I0319 12:22:53.895104 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-46b4x_304c0dcc-9c40-4bf4-9c05-9d1a4601b15c/kube-rbac-proxy/0.log" Mar 19 12:22:53.957304 master-0 kubenswrapper[17644]: I0319 12:22:53.957093 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/controller/0.log" Mar 19 12:22:54.002764 master-0 kubenswrapper[17644]: I0319 12:22:54.002599 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/frr/0.log" Mar 19 12:22:54.016296 master-0 kubenswrapper[17644]: I0319 12:22:54.016215 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/reloader/0.log" Mar 19 12:22:54.026078 master-0 kubenswrapper[17644]: I0319 12:22:54.026038 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/frr-metrics/0.log" Mar 19 12:22:54.049644 master-0 kubenswrapper[17644]: I0319 12:22:54.047787 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/kube-rbac-proxy/0.log" Mar 19 12:22:54.058542 master-0 kubenswrapper[17644]: I0319 12:22:54.058474 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/kube-rbac-proxy-frr/0.log" Mar 19 12:22:54.073973 master-0 kubenswrapper[17644]: I0319 12:22:54.073933 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-frr-files/0.log" Mar 19 12:22:54.092600 master-0 kubenswrapper[17644]: I0319 12:22:54.092531 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-reloader/0.log" Mar 19 12:22:54.110087 master-0 kubenswrapper[17644]: I0319 12:22:54.109984 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-metrics/0.log" Mar 19 12:22:54.137314 master-0 kubenswrapper[17644]: I0319 12:22:54.136347 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-rj2bt_8c5e6757-8626-4cd1-8736-b41978d173f1/frr-k8s-webhook-server/0.log" Mar 19 12:22:54.177290 master-0 kubenswrapper[17644]: I0319 12:22:54.177207 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c8d7d7bcf-bpkwf_7bed1df9-51fd-4f70-95e9-4ec7333995d1/manager/0.log" Mar 19 12:22:54.198754 master-0 kubenswrapper[17644]: I0319 12:22:54.195578 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b47fbc9b4-zf5jl_e87720f9-bdd2-4397-808c-b51869af7cfe/webhook-server/0.log" Mar 19 12:22:54.225410 master-0 kubenswrapper[17644]: I0319 12:22:54.225374 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-5v79v_bda90bb9-a85d-4dba-b00b-7721557694bc/nmstate-console-plugin/0.log" Mar 19 12:22:54.284779 master-0 kubenswrapper[17644]: I0319 12:22:54.284082 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pbgxp_e6f7e8f5-8cca-400e-9eea-2961d3f9920f/nmstate-handler/0.log" Mar 19 12:22:54.302801 master-0 kubenswrapper[17644]: I0319 12:22:54.301928 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q82hx_63931868-d8f1-4227-b911-81d786835fbb/speaker/0.log" Mar 19 12:22:54.311748 master-0 kubenswrapper[17644]: I0319 12:22:54.311673 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-7hz8v_defe43e5-1621-42e7-9e79-bc48c2bbfb5c/nmstate-metrics/0.log" Mar 19 12:22:54.315704 master-0 kubenswrapper[17644]: I0319 12:22:54.315652 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q82hx_63931868-d8f1-4227-b911-81d786835fbb/kube-rbac-proxy/0.log" Mar 19 12:22:54.326194 master-0 kubenswrapper[17644]: I0319 12:22:54.326127 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-7hz8v_defe43e5-1621-42e7-9e79-bc48c2bbfb5c/kube-rbac-proxy/0.log" Mar 19 12:22:54.354469 master-0 kubenswrapper[17644]: I0319 12:22:54.354132 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-8ht49_8de64a53-181c-4b60-a814-c8f104593009/nmstate-operator/0.log" Mar 19 12:22:54.381416 master-0 kubenswrapper[17644]: I0319 12:22:54.381377 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-94bfp_8d8c7433-5218-4713-9e76-1c94175acd1c/nmstate-webhook/0.log" Mar 19 12:22:54.669803 master-0 kubenswrapper[17644]: E0319 12:22:54.668418 17644 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:35684->192.168.32.10:46843: write tcp 192.168.32.10:35684->192.168.32.10:46843: write: broken pipe Mar 19 12:22:57.357751 master-0 kubenswrapper[17644]: I0319 12:22:57.356714 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-65dbcfd7b7-qq8lc_282960d1-08a2-4187-8279-2081bfdda059/oauth-openshift/0.log" Mar 19 12:22:57.858171 master-0 kubenswrapper[17644]: I0319 12:22:57.857665 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcdctl/0.log" Mar 19 12:22:57.924528 master-0 kubenswrapper[17644]: I0319 12:22:57.924453 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd/0.log" Mar 19 12:22:57.948185 master-0 kubenswrapper[17644]: I0319 12:22:57.948109 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-metrics/0.log" Mar 19 12:22:57.986322 master-0 kubenswrapper[17644]: I0319 12:22:57.980149 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-readyz/0.log" Mar 19 12:22:58.012832 master-0 kubenswrapper[17644]: I0319 12:22:58.008299 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-rev/0.log" Mar 19 12:22:58.027231 master-0 kubenswrapper[17644]: I0319 12:22:58.027166 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/setup/0.log" Mar 19 12:22:58.050756 master-0 kubenswrapper[17644]: I0319 12:22:58.048350 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-ensure-env-vars/0.log" Mar 19 12:22:58.071114 master-0 kubenswrapper[17644]: I0319 12:22:58.071028 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-resources-copy/0.log" Mar 19 12:22:58.131366 master-0 kubenswrapper[17644]: I0319 12:22:58.130060 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_6bde080b-3820-463f-a27d-9fb9a7843d5d/installer/0.log" Mar 19 12:22:58.192774 master-0 kubenswrapper[17644]: I0319 12:22:58.185163 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_4ec000c4-5cc8-45b3-95ba-2856655f02f5/installer/0.log" Mar 19 12:22:58.625759 master-0 kubenswrapper[17644]: I0319 12:22:58.625695 17644 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-gqd94_732989c5-1b89-46f0-9917-b68613f7f005/authentication-operator/0.log" Mar 19 12:22:58.687773 master-0 kubenswrapper[17644]: I0319 12:22:58.686533 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-gqd94_732989c5-1b89-46f0-9917-b68613f7f005/authentication-operator/1.log" Mar 19 12:22:59.041134 master-0 kubenswrapper[17644]: I0319 12:22:59.039265 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-48bcp_c13ffb3e-ab50-411c-9208-7ba47e8ebc92/assisted-installer-controller/0.log" Mar 19 12:22:59.701779 master-0 kubenswrapper[17644]: I0319 12:22:59.701697 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dcf5569b5-kpmgt_e2ad29ad-70ef-43c6-91f6-02f04d145673/router/0.log" Mar 19 12:23:00.366176 master-0 kubenswrapper[17644]: I0319 12:23:00.366130 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-899bc59d8-xxr9r_e45616db-f7dd-4a08-847f-abf2759d9fa4/oauth-apiserver/0.log" Mar 19 12:23:00.384895 master-0 kubenswrapper[17644]: I0319 12:23:00.384252 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-899bc59d8-xxr9r_e45616db-f7dd-4a08-847f-abf2759d9fa4/fix-audit-permissions/0.log" Mar 19 12:23:00.645199 master-0 kubenswrapper[17644]: I0319 12:23:00.645051 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-84vlf/must-gather-r5msw" event={"ID":"18b48b71-6c25-4744-b9dc-cea2d319efc5","Type":"ContainerStarted","Data":"64646e4835c8e9fe75a3ab4b28f2a31d27fba73fc22e5c4a890cea90c8898dcf"} Mar 19 12:23:00.990436 master-0 kubenswrapper[17644]: I0319 12:23:00.990142 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-dnx7f_ac09dba7-398c-4b0a-a415-edb73cb4cf30/kube-rbac-proxy/0.log" Mar 19 12:23:01.030156 master-0 kubenswrapper[17644]: I0319 12:23:01.029958 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-dnx7f_ac09dba7-398c-4b0a-a415-edb73cb4cf30/cluster-autoscaler-operator/0.log" Mar 19 12:23:01.049001 master-0 kubenswrapper[17644]: I0319 12:23:01.048951 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/1.log" Mar 19 12:23:01.049786 master-0 kubenswrapper[17644]: I0319 12:23:01.049757 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/2.log" Mar 19 12:23:01.061292 master-0 kubenswrapper[17644]: I0319 12:23:01.061257 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/baremetal-kube-rbac-proxy/0.log" Mar 19 12:23:01.085571 master-0 kubenswrapper[17644]: I0319 12:23:01.085505 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-j7rc9_7a51eeaf-1349-4bf3-932d-22ed5ce7c161/control-plane-machine-set-operator/1.log" Mar 19 12:23:01.086391 master-0 kubenswrapper[17644]: I0319 12:23:01.086369 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-j7rc9_7a51eeaf-1349-4bf3-932d-22ed5ce7c161/control-plane-machine-set-operator/0.log" Mar 19 12:23:01.115711 master-0 kubenswrapper[17644]: I0319 12:23:01.115644 17644 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc"] Mar 19 12:23:01.117324 master-0 kubenswrapper[17644]: I0319 12:23:01.117291 17644 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.118598 master-0 kubenswrapper[17644]: I0319 12:23:01.118555 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-jf7p6_75aedbcd-f6ed-43a1-941b-4b04887ffe8e/kube-rbac-proxy/0.log" Mar 19 12:23:01.140313 master-0 kubenswrapper[17644]: I0319 12:23:01.140261 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc"] Mar 19 12:23:01.169891 master-0 kubenswrapper[17644]: I0319 12:23:01.169389 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-jf7p6_75aedbcd-f6ed-43a1-941b-4b04887ffe8e/machine-api-operator/0.log" Mar 19 12:23:01.221540 master-0 kubenswrapper[17644]: I0319 12:23:01.221421 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-proc\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.221805 master-0 kubenswrapper[17644]: I0319 12:23:01.221556 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-podres\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.221805 master-0 kubenswrapper[17644]: I0319 12:23:01.221685 17644 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkbxh\" (UniqueName: \"kubernetes.io/projected/449f610c-57b6-4484-b506-7a7e0d3bb11f-kube-api-access-qkbxh\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.221805 master-0 kubenswrapper[17644]: I0319 12:23:01.221780 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-lib-modules\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.221954 master-0 kubenswrapper[17644]: I0319 12:23:01.221829 17644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-sys\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.323598 master-0 kubenswrapper[17644]: I0319 12:23:01.323442 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-proc\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.323800 master-0 kubenswrapper[17644]: I0319 12:23:01.323598 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-proc\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " 
pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.323800 master-0 kubenswrapper[17644]: I0319 12:23:01.323608 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-podres\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.323800 master-0 kubenswrapper[17644]: I0319 12:23:01.323698 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkbxh\" (UniqueName: \"kubernetes.io/projected/449f610c-57b6-4484-b506-7a7e0d3bb11f-kube-api-access-qkbxh\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.323800 master-0 kubenswrapper[17644]: I0319 12:23:01.323760 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-podres\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.323947 master-0 kubenswrapper[17644]: I0319 12:23:01.323804 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-lib-modules\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.323988 master-0 kubenswrapper[17644]: I0319 12:23:01.323933 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-lib-modules\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.323988 master-0 kubenswrapper[17644]: I0319 12:23:01.323953 17644 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-sys\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.324073 master-0 kubenswrapper[17644]: I0319 12:23:01.324000 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449f610c-57b6-4484-b506-7a7e0d3bb11f-sys\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.341543 master-0 kubenswrapper[17644]: I0319 12:23:01.341479 17644 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkbxh\" (UniqueName: \"kubernetes.io/projected/449f610c-57b6-4484-b506-7a7e0d3bb11f-kube-api-access-qkbxh\") pod \"perf-node-gather-daemonset-2czqc\" (UID: \"449f610c-57b6-4484-b506-7a7e0d3bb11f\") " pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.435981 master-0 kubenswrapper[17644]: I0319 12:23:01.435911 17644 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:01.659083 master-0 kubenswrapper[17644]: I0319 12:23:01.658171 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-84vlf/must-gather-r5msw" event={"ID":"18b48b71-6c25-4744-b9dc-cea2d319efc5","Type":"ContainerStarted","Data":"a6a0cadf0169db2d7206e81b7c0fd291bb761abaff0d51d50c1261b65200ee71"} Mar 19 12:23:01.699596 master-0 kubenswrapper[17644]: I0319 12:23:01.698089 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-84vlf/must-gather-r5msw" podStartSLOduration=2.87003784 podStartE2EDuration="15.698070658s" podCreationTimestamp="2026-03-19 12:22:46 +0000 UTC" firstStartedPulling="2026-03-19 12:22:47.304962378 +0000 UTC m=+1401.074920423" lastFinishedPulling="2026-03-19 12:23:00.132995206 +0000 UTC m=+1413.902953241" observedRunningTime="2026-03-19 12:23:01.692482669 +0000 UTC m=+1415.462440724" watchObservedRunningTime="2026-03-19 12:23:01.698070658 +0000 UTC m=+1415.468028693" Mar 19 12:23:01.884901 master-0 kubenswrapper[17644]: W0319 12:23:01.884814 17644 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod449f610c_57b6_4484_b506_7a7e0d3bb11f.slice/crio-c2660765673be7a22f55e63b4961ae4150a32b37e96471cd7f129e5d28476edf WatchSource:0}: Error finding container c2660765673be7a22f55e63b4961ae4150a32b37e96471cd7f129e5d28476edf: Status 404 returned error can't find the container with id c2660765673be7a22f55e63b4961ae4150a32b37e96471cd7f129e5d28476edf Mar 19 12:23:01.887878 master-0 kubenswrapper[17644]: I0319 12:23:01.887812 17644 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc"] Mar 19 12:23:02.209855 master-0 kubenswrapper[17644]: I0319 12:23:02.209780 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/cluster-cloud-controller-manager/1.log" Mar 19 12:23:02.211039 master-0 kubenswrapper[17644]: I0319 12:23:02.210988 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/cluster-cloud-controller-manager/0.log" Mar 19 12:23:02.226390 master-0 kubenswrapper[17644]: I0319 12:23:02.226317 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/config-sync-controllers/1.log" Mar 19 12:23:02.227273 master-0 kubenswrapper[17644]: I0319 12:23:02.227233 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/config-sync-controllers/0.log" Mar 19 12:23:02.241651 master-0 kubenswrapper[17644]: I0319 12:23:02.241581 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-87z86_c8d8a09f-22d5-4f16-84d6-d5f2c504c949/kube-rbac-proxy/0.log" Mar 19 12:23:02.669381 master-0 kubenswrapper[17644]: I0319 12:23:02.669239 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" event={"ID":"449f610c-57b6-4484-b506-7a7e0d3bb11f","Type":"ContainerStarted","Data":"6d88e1db2148ef45684b168c488db21eed110b9dd5ef6c401cdd6387bf5e2f36"} Mar 19 12:23:02.669381 master-0 kubenswrapper[17644]: I0319 12:23:02.669338 17644 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" 
event={"ID":"449f610c-57b6-4484-b506-7a7e0d3bb11f","Type":"ContainerStarted","Data":"c2660765673be7a22f55e63b4961ae4150a32b37e96471cd7f129e5d28476edf"} Mar 19 12:23:02.817413 master-0 kubenswrapper[17644]: I0319 12:23:02.817330 17644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" podStartSLOduration=1.817310789 podStartE2EDuration="1.817310789s" podCreationTimestamp="2026-03-19 12:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:23:02.811092725 +0000 UTC m=+1416.581050770" watchObservedRunningTime="2026-03-19 12:23:02.817310789 +0000 UTC m=+1416.587268824" Mar 19 12:23:03.677198 master-0 kubenswrapper[17644]: I0319 12:23:03.677127 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:03.816950 master-0 kubenswrapper[17644]: I0319 12:23:03.816888 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-mhvls_0cbbe8d0-aafb-499f-a1f4-affcea62c1ab/kube-rbac-proxy/0.log" Mar 19 12:23:03.853851 master-0 kubenswrapper[17644]: I0319 12:23:03.853778 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-mhvls_0cbbe8d0-aafb-499f-a1f4-affcea62c1ab/cloud-credential-operator/0.log" Mar 19 12:23:05.210524 master-0 kubenswrapper[17644]: I0319 12:23:05.210461 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-ng9ss_a3ceeece-bee9-4fcb-8517-95ebce38e223/openshift-config-operator/2.log" Mar 19 12:23:05.213065 master-0 kubenswrapper[17644]: I0319 12:23:05.213012 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-ng9ss_a3ceeece-bee9-4fcb-8517-95ebce38e223/openshift-config-operator/3.log" Mar 19 12:23:05.232755 master-0 kubenswrapper[17644]: I0319 12:23:05.232295 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-ng9ss_a3ceeece-bee9-4fcb-8517-95ebce38e223/openshift-api/0.log" Mar 19 12:23:05.912873 master-0 kubenswrapper[17644]: I0319 12:23:05.912635 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-8bvjj_d2fd7597-cd7a-4138-bb3c-01681c569bd3/console-operator/0.log" Mar 19 12:23:06.513601 master-0 kubenswrapper[17644]: I0319 12:23:06.513554 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-46b4x_304c0dcc-9c40-4bf4-9c05-9d1a4601b15c/controller/0.log" Mar 19 12:23:06.518463 master-0 kubenswrapper[17644]: I0319 12:23:06.518331 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-46b4x_304c0dcc-9c40-4bf4-9c05-9d1a4601b15c/kube-rbac-proxy/0.log" Mar 19 12:23:06.538623 master-0 kubenswrapper[17644]: I0319 12:23:06.538577 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/controller/0.log" Mar 19 12:23:06.549195 master-0 kubenswrapper[17644]: I0319 12:23:06.549140 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-657ddb758d-h2zns_db5ffad0-a78e-4f97-a915-39bf347b53ca/console/0.log" Mar 19 12:23:06.583997 master-0 kubenswrapper[17644]: I0319 12:23:06.583941 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-66b8ffb895-vqnnc_e17d22fe-fe0f-448e-9666-882d888d3ad4/download-server/0.log" Mar 19 12:23:06.589517 master-0 kubenswrapper[17644]: I0319 12:23:06.589483 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/frr/0.log" Mar 19 12:23:06.599470 master-0 kubenswrapper[17644]: I0319 12:23:06.598441 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/reloader/0.log" Mar 19 12:23:06.603870 master-0 kubenswrapper[17644]: I0319 12:23:06.603628 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/frr-metrics/0.log" Mar 19 12:23:06.611529 master-0 kubenswrapper[17644]: I0319 12:23:06.611481 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/kube-rbac-proxy/0.log" Mar 19 12:23:06.618236 master-0 kubenswrapper[17644]: I0319 12:23:06.618172 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/kube-rbac-proxy-frr/0.log" Mar 19 12:23:06.625528 master-0 kubenswrapper[17644]: I0319 12:23:06.625483 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-frr-files/0.log" Mar 19 12:23:06.635221 master-0 kubenswrapper[17644]: I0319 12:23:06.635177 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-reloader/0.log" Mar 19 12:23:06.642324 master-0 kubenswrapper[17644]: I0319 12:23:06.642284 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-metrics/0.log" Mar 19 12:23:06.655822 master-0 kubenswrapper[17644]: I0319 12:23:06.655768 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-rj2bt_8c5e6757-8626-4cd1-8736-b41978d173f1/frr-k8s-webhook-server/0.log" Mar 19 12:23:06.687522 master-0 
kubenswrapper[17644]: I0319 12:23:06.687458 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c8d7d7bcf-bpkwf_7bed1df9-51fd-4f70-95e9-4ec7333995d1/manager/0.log" Mar 19 12:23:06.712171 master-0 kubenswrapper[17644]: I0319 12:23:06.712105 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b47fbc9b4-zf5jl_e87720f9-bdd2-4397-808c-b51869af7cfe/webhook-server/0.log" Mar 19 12:23:06.809180 master-0 kubenswrapper[17644]: I0319 12:23:06.809062 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q82hx_63931868-d8f1-4227-b911-81d786835fbb/speaker/0.log" Mar 19 12:23:06.815968 master-0 kubenswrapper[17644]: I0319 12:23:06.815921 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q82hx_63931868-d8f1-4227-b911-81d786835fbb/kube-rbac-proxy/0.log" Mar 19 12:23:07.467377 master-0 kubenswrapper[17644]: I0319 12:23:07.467316 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-7d87854d6-htdhf_52bdf7cc-f07d-487e-937c-6567f194947e/cluster-storage-operator/0.log" Mar 19 12:23:07.492771 master-0 kubenswrapper[17644]: I0319 12:23:07.492689 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/4.log" Mar 19 12:23:07.493272 master-0 kubenswrapper[17644]: I0319 12:23:07.493210 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-764k4_d625c81e-01cc-424a-997d-546a5204a72b/snapshot-controller/5.log" Mar 19 12:23:07.524211 master-0 kubenswrapper[17644]: I0319 12:23:07.524143 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5f5d689c6b-fx8ng_2292109e-92a9-4286-858e-dcd2ac083c43/csi-snapshot-controller-operator/0.log" Mar 19 12:23:08.146470 master-0 kubenswrapper[17644]: I0319 12:23:08.146426 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-9c5679d8f-965np_22e10648-af7c-409e-b947-570e7d807e05/dns-operator/0.log" Mar 19 12:23:08.158868 master-0 kubenswrapper[17644]: I0319 12:23:08.158808 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-9c5679d8f-965np_22e10648-af7c-409e-b947-570e7d807e05/kube-rbac-proxy/0.log" Mar 19 12:23:08.683293 master-0 kubenswrapper[17644]: I0319 12:23:08.683246 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ztgjs_1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c/dns/0.log" Mar 19 12:23:08.703439 master-0 kubenswrapper[17644]: I0319 12:23:08.703401 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ztgjs_1e8cbab5-01c5-4f58-9a06-41e7c4c68c1c/kube-rbac-proxy/0.log" Mar 19 12:23:08.718811 master-0 kubenswrapper[17644]: I0319 12:23:08.718764 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pm77f_1c898657-f06b-44ab-95ff-53a324759ba1/dns-node-resolver/0.log" Mar 19 12:23:09.053520 master-0 kubenswrapper[17644]: I0319 12:23:09.053405 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-76j62_aff4c24a-63b7-44ea-86de-c543b1afd15f/registry-server/0.log" Mar 19 12:23:09.381090 master-0 kubenswrapper[17644]: I0319 12:23:09.380942 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-9w7hc_8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/etcd-operator/1.log" Mar 19 12:23:09.388610 master-0 kubenswrapper[17644]: I0319 12:23:09.388555 17644 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-9w7hc_8fe4839d-cef4-4ec9-b146-2ae9b76d8a76/etcd-operator/0.log" Mar 19 12:23:10.036892 master-0 kubenswrapper[17644]: I0319 12:23:10.036834 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcdctl/0.log" Mar 19 12:23:10.099946 master-0 kubenswrapper[17644]: I0319 12:23:10.099883 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd/0.log" Mar 19 12:23:10.111760 master-0 kubenswrapper[17644]: I0319 12:23:10.111708 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-metrics/0.log" Mar 19 12:23:10.123308 master-0 kubenswrapper[17644]: I0319 12:23:10.123250 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-readyz/0.log" Mar 19 12:23:10.139684 master-0 kubenswrapper[17644]: I0319 12:23:10.139616 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-rev/0.log" Mar 19 12:23:10.152564 master-0 kubenswrapper[17644]: I0319 12:23:10.152479 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/setup/0.log" Mar 19 12:23:10.167504 master-0 kubenswrapper[17644]: I0319 12:23:10.167456 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-ensure-env-vars/0.log" Mar 19 12:23:10.194044 master-0 kubenswrapper[17644]: I0319 12:23:10.193359 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-resources-copy/0.log" Mar 19 12:23:10.245530 master-0 kubenswrapper[17644]: I0319 12:23:10.245473 17644 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_6bde080b-3820-463f-a27d-9fb9a7843d5d/installer/0.log" Mar 19 12:23:10.301594 master-0 kubenswrapper[17644]: I0319 12:23:10.301442 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_4ec000c4-5cc8-45b3-95ba-2856655f02f5/installer/0.log" Mar 19 12:23:10.981583 master-0 kubenswrapper[17644]: I0319 12:23:10.981497 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-5549dc66cb-nrtp2_a6fe3532-9dd4-42e4-b75c-2c3be0f3e5f1/cluster-image-registry-operator/0.log" Mar 19 12:23:10.999085 master-0 kubenswrapper[17644]: I0319 12:23:10.998533 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ksvww_cf8f5cd4-3f6a-43f6-bdbe-9fc79f015851/node-ca/0.log" Mar 19 12:23:11.465008 master-0 kubenswrapper[17644]: I0319 12:23:11.464947 17644 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-84vlf/perf-node-gather-daemonset-2czqc" Mar 19 12:23:11.619368 master-0 kubenswrapper[17644]: I0319 12:23:11.619274 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/2.log" Mar 19 12:23:11.622033 master-0 kubenswrapper[17644]: I0319 12:23:11.621973 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/ingress-operator/1.log" Mar 19 12:23:11.635311 master-0 kubenswrapper[17644]: I0319 12:23:11.635261 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-qrjj4_163d6a3d-0080-4122-bb7a-17f6e63f00f0/kube-rbac-proxy/0.log" Mar 19 12:23:12.426887 master-0 kubenswrapper[17644]: I0319 12:23:12.426823 17644 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b8hzk_7f06b4ae-bfd4-465d-b2e2-465cc186cb4b/serve-healthcheck-canary/0.log" Mar 19 12:23:12.950112 master-0 kubenswrapper[17644]: I0319 12:23:12.950041 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-68bf6ff9d6-djfg8_034cad93-a500-4c58-8d97-fa49866a0d5e/insights-operator/0.log" Mar 19 12:23:14.596661 master-0 kubenswrapper[17644]: I0319 12:23:14.596515 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9/alertmanager/0.log" Mar 19 12:23:14.630921 master-0 kubenswrapper[17644]: I0319 12:23:14.630865 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9/config-reloader/0.log" Mar 19 12:23:14.657794 master-0 kubenswrapper[17644]: I0319 12:23:14.657652 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9/kube-rbac-proxy-web/0.log" Mar 19 12:23:14.679295 master-0 kubenswrapper[17644]: I0319 12:23:14.677758 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9/kube-rbac-proxy/0.log" Mar 19 12:23:14.697556 master-0 kubenswrapper[17644]: I0319 12:23:14.697467 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9/kube-rbac-proxy-metric/0.log" Mar 19 12:23:14.714644 master-0 kubenswrapper[17644]: I0319 12:23:14.714568 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9/prom-label-proxy/0.log" Mar 19 12:23:14.735917 master-0 kubenswrapper[17644]: I0319 12:23:14.735458 17644 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ed7c7fd0-1772-43a4-b4b6-84dfe358f5b9/init-config-reloader/0.log" Mar 19 12:23:14.772446 master-0 kubenswrapper[17644]: I0319 12:23:14.771872 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-58845fbb57-tkcwh_681b76b7-7dcb-49df-bfdd-fe7e4bcb10e9/cluster-monitoring-operator/0.log" Mar 19 12:23:14.797853 master-0 kubenswrapper[17644]: I0319 12:23:14.797183 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-xkg9f_2d63d5a8-f45d-4678-824d-5534b2bcd6ca/kube-state-metrics/0.log" Mar 19 12:23:14.810443 master-0 kubenswrapper[17644]: I0319 12:23:14.810382 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-xkg9f_2d63d5a8-f45d-4678-824d-5534b2bcd6ca/kube-rbac-proxy-main/0.log" Mar 19 12:23:14.829630 master-0 kubenswrapper[17644]: I0319 12:23:14.829378 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-xkg9f_2d63d5a8-f45d-4678-824d-5534b2bcd6ca/kube-rbac-proxy-self/0.log" Mar 19 12:23:14.854516 master-0 kubenswrapper[17644]: I0319 12:23:14.852720 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7f479f8754-7s22b_33f7a977-41a3-4668-9cc4-1330f87bdd29/metrics-server/0.log" Mar 19 12:23:14.880199 master-0 kubenswrapper[17644]: I0319 12:23:14.880143 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-69dbdb4674-llwxn_ef4b0a53-dd65-40cf-adca-8ec46a55d28a/monitoring-plugin/0.log" Mar 19 12:23:14.906687 master-0 kubenswrapper[17644]: I0319 12:23:14.906627 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pnb9m_d06b230b-db67-4afc-8d10-2c33ad568462/node-exporter/0.log" Mar 19 12:23:14.918551 master-0 
kubenswrapper[17644]: I0319 12:23:14.918504 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pnb9m_d06b230b-db67-4afc-8d10-2c33ad568462/kube-rbac-proxy/0.log" Mar 19 12:23:14.952647 master-0 kubenswrapper[17644]: I0319 12:23:14.952586 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pnb9m_d06b230b-db67-4afc-8d10-2c33ad568462/init-textfile/0.log" Mar 19 12:23:14.974341 master-0 kubenswrapper[17644]: I0319 12:23:14.974267 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-lwqmn_dedf55c4-eeda-4955-aafe-db1fdb8c4a48/kube-rbac-proxy-main/0.log" Mar 19 12:23:14.999708 master-0 kubenswrapper[17644]: I0319 12:23:14.999649 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-lwqmn_dedf55c4-eeda-4955-aafe-db1fdb8c4a48/kube-rbac-proxy-self/0.log" Mar 19 12:23:15.014810 master-0 kubenswrapper[17644]: I0319 12:23:15.014764 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-lwqmn_dedf55c4-eeda-4955-aafe-db1fdb8c4a48/openshift-state-metrics/0.log" Mar 19 12:23:15.046548 master-0 kubenswrapper[17644]: I0319 12:23:15.046458 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce7ed383-661e-4825-8eb0-ea529a90acc2/prometheus/0.log" Mar 19 12:23:15.062962 master-0 kubenswrapper[17644]: I0319 12:23:15.062895 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce7ed383-661e-4825-8eb0-ea529a90acc2/config-reloader/0.log" Mar 19 12:23:15.084944 master-0 kubenswrapper[17644]: I0319 12:23:15.084895 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce7ed383-661e-4825-8eb0-ea529a90acc2/thanos-sidecar/0.log" Mar 19 12:23:15.101467 master-0 
kubenswrapper[17644]: I0319 12:23:15.101408 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce7ed383-661e-4825-8eb0-ea529a90acc2/kube-rbac-proxy-web/0.log" Mar 19 12:23:15.115563 master-0 kubenswrapper[17644]: I0319 12:23:15.115438 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce7ed383-661e-4825-8eb0-ea529a90acc2/kube-rbac-proxy/0.log" Mar 19 12:23:15.127393 master-0 kubenswrapper[17644]: I0319 12:23:15.127358 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce7ed383-661e-4825-8eb0-ea529a90acc2/kube-rbac-proxy-thanos/0.log" Mar 19 12:23:15.149197 master-0 kubenswrapper[17644]: I0319 12:23:15.149142 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce7ed383-661e-4825-8eb0-ea529a90acc2/init-config-reloader/0.log" Mar 19 12:23:15.179467 master-0 kubenswrapper[17644]: I0319 12:23:15.179401 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6c8df6d4b-xfwkr_f4aad0ff-e6cd-4c43-9561-80a14fee4712/prometheus-operator/0.log" Mar 19 12:23:15.193872 master-0 kubenswrapper[17644]: I0319 12:23:15.193792 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6c8df6d4b-xfwkr_f4aad0ff-e6cd-4c43-9561-80a14fee4712/kube-rbac-proxy/0.log" Mar 19 12:23:15.215284 master-0 kubenswrapper[17644]: I0319 12:23:15.215219 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-69c6b55594-89rdt_9778f8f5-b0d1-4967-9776-9db758bba3af/prometheus-operator-admission-webhook/0.log" Mar 19 12:23:15.240891 master-0 kubenswrapper[17644]: I0319 12:23:15.240826 17644 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-66b5747dd4-gk5bw_213436b9-a964-4083-9187-65c82be4bb24/telemeter-client/0.log"
Mar 19 12:23:15.254585 master-0 kubenswrapper[17644]: I0319 12:23:15.254543 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-66b5747dd4-gk5bw_213436b9-a964-4083-9187-65c82be4bb24/reload/0.log"
Mar 19 12:23:15.272678 master-0 kubenswrapper[17644]: I0319 12:23:15.272632 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-66b5747dd4-gk5bw_213436b9-a964-4083-9187-65c82be4bb24/kube-rbac-proxy/0.log"
Mar 19 12:23:15.301418 master-0 kubenswrapper[17644]: I0319 12:23:15.301365 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-747db875db-zc5nj_da6da885-6a82-47bd-a90f-ce81d8e78929/thanos-query/0.log"
Mar 19 12:23:15.319771 master-0 kubenswrapper[17644]: I0319 12:23:15.319704 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-747db875db-zc5nj_da6da885-6a82-47bd-a90f-ce81d8e78929/kube-rbac-proxy-web/0.log"
Mar 19 12:23:15.335101 master-0 kubenswrapper[17644]: I0319 12:23:15.335055 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-747db875db-zc5nj_da6da885-6a82-47bd-a90f-ce81d8e78929/kube-rbac-proxy/0.log"
Mar 19 12:23:15.349419 master-0 kubenswrapper[17644]: I0319 12:23:15.349365 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-747db875db-zc5nj_da6da885-6a82-47bd-a90f-ce81d8e78929/prom-label-proxy/0.log"
Mar 19 12:23:15.365036 master-0 kubenswrapper[17644]: I0319 12:23:15.364983 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-747db875db-zc5nj_da6da885-6a82-47bd-a90f-ce81d8e78929/kube-rbac-proxy-rules/0.log"
Mar 19 12:23:15.378923 master-0 kubenswrapper[17644]: I0319 12:23:15.378804 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-747db875db-zc5nj_da6da885-6a82-47bd-a90f-ce81d8e78929/kube-rbac-proxy-metrics/0.log"
Mar 19 12:23:15.889505 master-0 kubenswrapper[17644]: I0319 12:23:15.889444 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-dnx7f_ac09dba7-398c-4b0a-a415-edb73cb4cf30/kube-rbac-proxy/0.log"
Mar 19 12:23:15.936917 master-0 kubenswrapper[17644]: I0319 12:23:15.936824 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-dnx7f_ac09dba7-398c-4b0a-a415-edb73cb4cf30/cluster-autoscaler-operator/0.log"
Mar 19 12:23:15.952394 master-0 kubenswrapper[17644]: I0319 12:23:15.952336 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/1.log"
Mar 19 12:23:15.953248 master-0 kubenswrapper[17644]: I0319 12:23:15.953205 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/cluster-baremetal-operator/2.log"
Mar 19 12:23:15.961466 master-0 kubenswrapper[17644]: I0319 12:23:15.961405 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-942g6_92e401a4-ed2f-46f7-924b-329d7b313e6a/baremetal-kube-rbac-proxy/0.log"
Mar 19 12:23:15.975814 master-0 kubenswrapper[17644]: I0319 12:23:15.975713 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-j7rc9_7a51eeaf-1349-4bf3-932d-22ed5ce7c161/control-plane-machine-set-operator/1.log"
Mar 19 12:23:15.976978 master-0 kubenswrapper[17644]: I0319 12:23:15.976932 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-j7rc9_7a51eeaf-1349-4bf3-932d-22ed5ce7c161/control-plane-machine-set-operator/0.log"
Mar 19 12:23:15.987647 master-0 kubenswrapper[17644]: I0319 12:23:15.987594 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-jf7p6_75aedbcd-f6ed-43a1-941b-4b04887ffe8e/kube-rbac-proxy/0.log"
Mar 19 12:23:15.999142 master-0 kubenswrapper[17644]: I0319 12:23:15.999081 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-jf7p6_75aedbcd-f6ed-43a1-941b-4b04887ffe8e/machine-api-operator/0.log"
Mar 19 12:23:17.013052 master-0 kubenswrapper[17644]: I0319 12:23:17.012998 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-46b4x_304c0dcc-9c40-4bf4-9c05-9d1a4601b15c/controller/0.log"
Mar 19 12:23:17.025256 master-0 kubenswrapper[17644]: I0319 12:23:17.025195 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-46b4x_304c0dcc-9c40-4bf4-9c05-9d1a4601b15c/kube-rbac-proxy/0.log"
Mar 19 12:23:17.052009 master-0 kubenswrapper[17644]: I0319 12:23:17.051952 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/controller/0.log"
Mar 19 12:23:17.110138 master-0 kubenswrapper[17644]: I0319 12:23:17.110077 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/frr/0.log"
Mar 19 12:23:17.123004 master-0 kubenswrapper[17644]: I0319 12:23:17.122936 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/reloader/0.log"
Mar 19 12:23:17.133510 master-0 kubenswrapper[17644]: I0319 12:23:17.133425 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/frr-metrics/0.log"
Mar 19 12:23:17.146424 master-0 kubenswrapper[17644]: I0319 12:23:17.146367 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/kube-rbac-proxy/0.log"
Mar 19 12:23:17.157509 master-0 kubenswrapper[17644]: I0319 12:23:17.157431 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/kube-rbac-proxy-frr/0.log"
Mar 19 12:23:17.172023 master-0 kubenswrapper[17644]: I0319 12:23:17.171974 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-frr-files/0.log"
Mar 19 12:23:17.190938 master-0 kubenswrapper[17644]: I0319 12:23:17.190817 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-reloader/0.log"
Mar 19 12:23:17.237912 master-0 kubenswrapper[17644]: I0319 12:23:17.237819 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tcwpc_75793445-a8c5-4cf7-8d0b-561fae8411fe/cp-metrics/0.log"
Mar 19 12:23:17.253791 master-0 kubenswrapper[17644]: I0319 12:23:17.253127 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-rj2bt_8c5e6757-8626-4cd1-8736-b41978d173f1/frr-k8s-webhook-server/0.log"
Mar 19 12:23:17.286354 master-0 kubenswrapper[17644]: I0319 12:23:17.286237 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c8d7d7bcf-bpkwf_7bed1df9-51fd-4f70-95e9-4ec7333995d1/manager/0.log"
Mar 19 12:23:17.303467 master-0 kubenswrapper[17644]: I0319 12:23:17.303407 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b47fbc9b4-zf5jl_e87720f9-bdd2-4397-808c-b51869af7cfe/webhook-server/0.log"
Mar 19 12:23:17.402157 master-0 kubenswrapper[17644]: I0319 12:23:17.402103 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q82hx_63931868-d8f1-4227-b911-81d786835fbb/speaker/0.log"
Mar 19 12:23:17.417255 master-0 kubenswrapper[17644]: I0319 12:23:17.417176 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q82hx_63931868-d8f1-4227-b911-81d786835fbb/kube-rbac-proxy/0.log"
Mar 19 12:23:18.695793 master-0 kubenswrapper[17644]: I0319 12:23:18.695737 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-598fbc5f8f-kb5vd_aaaaf539-bf61-44d7-8d47-97535b7aa1ba/cluster-node-tuning-operator/0.log"
Mar 19 12:23:18.720746 master-0 kubenswrapper[17644]: I0319 12:23:18.720680 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-x6mmm_8376e1f9-ab05-42d4-aa66-284a167a9bfc/tuned/0.log"
Mar 19 12:23:19.272629 master-0 kubenswrapper[17644]: I0319 12:23:19.272577 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-msc5g_0edff6cf-09c0-4eba-81fa-a4e78b150269/prometheus-operator/0.log"
Mar 19 12:23:19.288369 master-0 kubenswrapper[17644]: I0319 12:23:19.288321 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67df7f5c48-h2q2d_6229bbeb-842d-4465-962e-f8148d05cf6f/prometheus-operator-admission-webhook/0.log"
Mar 19 12:23:19.302173 master-0 kubenswrapper[17644]: I0319 12:23:19.302116 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67df7f5c48-snwnf_56241f4e-fa18-46a7-9be2-5b6d54cd4e26/prometheus-operator-admission-webhook/0.log"
Mar 19 12:23:19.325254 master-0 kubenswrapper[17644]: I0319 12:23:19.325203 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-dwzhl_7906718f-0803-4c47-a275-e2e02feb34c3/operator/0.log"
Mar 19 12:23:19.340757 master-0 kubenswrapper[17644]: I0319 12:23:19.340664 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5554d8fd8f-jkhd6_55660b4d-8a68-4062-bc7a-1216d9be2aa3/perses-operator/0.log"
Mar 19 12:23:20.654026 master-0 kubenswrapper[17644]: I0319 12:23:20.653957 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-pjc7h_39d3ac31-9259-454b-8e1c-e23024f8f2b2/kube-apiserver-operator/0.log"
Mar 19 12:23:20.675528 master-0 kubenswrapper[17644]: I0319 12:23:20.675470 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-pjc7h_39d3ac31-9259-454b-8e1c-e23024f8f2b2/kube-apiserver-operator/1.log"
Mar 19 12:23:21.326622 master-0 kubenswrapper[17644]: I0319 12:23:21.326546 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_1c576a88-6da4-43e9-a373-0df27a029f59/installer/0.log"
Mar 19 12:23:21.348715 master-0 kubenswrapper[17644]: I0319 12:23:21.348639 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_2dc94fa1-b977-4f27-9a30-f9dc6cbbfe92/installer/0.log"
Mar 19 12:23:21.374462 master-0 kubenswrapper[17644]: I0319 12:23:21.374392 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-6-master-0_dff9f91a-2293-4b2d-95dd-be0f9152984e/installer/0.log"
Mar 19 12:23:21.404766 master-0 kubenswrapper[17644]: I0319 12:23:21.404688 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-7-master-0_7def3099-f487-44d4-a1d5-2ae096ef8804/installer/0.log"
Mar 19 12:23:21.585990 master-0 kubenswrapper[17644]: I0319 12:23:21.585768 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver/0.log"
Mar 19 12:23:21.602156 master-0 kubenswrapper[17644]: I0319 12:23:21.602100 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver-cert-syncer/0.log"
Mar 19 12:23:21.629382 master-0 kubenswrapper[17644]: I0319 12:23:21.629306 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver-cert-regeneration-controller/0.log"
Mar 19 12:23:21.644209 master-0 kubenswrapper[17644]: I0319 12:23:21.644152 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver-insecure-readyz/0.log"
Mar 19 12:23:21.663636 master-0 kubenswrapper[17644]: I0319 12:23:21.663584 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver-check-endpoints/0.log"
Mar 19 12:23:21.675225 master-0 kubenswrapper[17644]: I0319 12:23:21.675170 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/setup/0.log"
Mar 19 12:23:21.694166 master-0 kubenswrapper[17644]: I0319 12:23:21.694076 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_revision-pruner-7-master-0_6daa7b3f-6abd-410f-a040-dcf6bf5521c7/pruner/0.log"
Mar 19 12:23:21.994848 master-0 kubenswrapper[17644]: I0319 12:23:21.994798 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-84qmh_58c25e67-f473-47f4-b461-e25d7761c102/cert-manager-controller/0.log"
Mar 19 12:23:22.007658 master-0 kubenswrapper[17644]: I0319 12:23:22.007591 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-rr2cw_a3fa634e-d357-45b8-b0bd-8ab6b961de7b/cert-manager-cainjector/0.log"
Mar 19 12:23:22.025754 master-0 kubenswrapper[17644]: I0319 12:23:22.025668 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-qgj2h_d98f87fe-cea4-41a5-9000-743954979694/cert-manager-webhook/0.log"
Mar 19 12:23:22.605353 master-0 kubenswrapper[17644]: I0319 12:23:22.605275 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/kube-rbac-proxy/0.log"
Mar 19 12:23:22.625023 master-0 kubenswrapper[17644]: I0319 12:23:22.624967 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/manager/2.log"
Mar 19 12:23:22.625588 master-0 kubenswrapper[17644]: I0319 12:23:22.625558 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-xzxpq_376b18a9-5f33-44fd-a37b-20ab02c5e65d/manager/1.log"
Mar 19 12:23:23.182806 master-0 kubenswrapper[17644]: I0319 12:23:23.182740 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-84qmh_58c25e67-f473-47f4-b461-e25d7761c102/cert-manager-controller/0.log"
Mar 19 12:23:23.197807 master-0 kubenswrapper[17644]: I0319 12:23:23.197750 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-rr2cw_a3fa634e-d357-45b8-b0bd-8ab6b961de7b/cert-manager-cainjector/0.log"
Mar 19 12:23:23.219316 master-0 kubenswrapper[17644]: I0319 12:23:23.219243 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-qgj2h_d98f87fe-cea4-41a5-9000-743954979694/cert-manager-webhook/0.log"
Mar 19 12:23:23.699329 master-0 kubenswrapper[17644]: I0319 12:23:23.699276 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-5v79v_bda90bb9-a85d-4dba-b00b-7721557694bc/nmstate-console-plugin/0.log"
Mar 19 12:23:23.720689 master-0 kubenswrapper[17644]: I0319 12:23:23.720641 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pbgxp_e6f7e8f5-8cca-400e-9eea-2961d3f9920f/nmstate-handler/0.log"
Mar 19 12:23:23.738427 master-0 kubenswrapper[17644]: I0319 12:23:23.738355 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-7hz8v_defe43e5-1621-42e7-9e79-bc48c2bbfb5c/nmstate-metrics/0.log"
Mar 19 12:23:23.754794 master-0 kubenswrapper[17644]: I0319 12:23:23.754742 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-7hz8v_defe43e5-1621-42e7-9e79-bc48c2bbfb5c/kube-rbac-proxy/0.log"
Mar 19 12:23:23.774957 master-0 kubenswrapper[17644]: I0319 12:23:23.774899 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-8ht49_8de64a53-181c-4b60-a814-c8f104593009/nmstate-operator/0.log"
Mar 19 12:23:23.789306 master-0 kubenswrapper[17644]: I0319 12:23:23.789222 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-94bfp_8d8c7433-5218-4713-9e76-1c94175acd1c/nmstate-webhook/0.log"
Mar 19 12:23:24.420546 master-0 kubenswrapper[17644]: I0319 12:23:24.420490 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-552pc_09a22c25-6073-4b1a-a029-928452ef37db/kube-multus/0.log"
Mar 19 12:23:24.446522 master-0 kubenswrapper[17644]: I0319 12:23:24.446440 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n8vwk_bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/kube-multus-additional-cni-plugins/0.log"
Mar 19 12:23:24.463971 master-0 kubenswrapper[17644]: I0319 12:23:24.463868 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n8vwk_bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/egress-router-binary-copy/0.log"
Mar 19 12:23:24.478598 master-0 kubenswrapper[17644]: I0319 12:23:24.478534 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n8vwk_bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/cni-plugins/0.log"
Mar 19 12:23:24.494463 master-0 kubenswrapper[17644]: I0319 12:23:24.494373 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n8vwk_bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/bond-cni-plugin/0.log"
Mar 19 12:23:24.508570 master-0 kubenswrapper[17644]: I0319 12:23:24.508473 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n8vwk_bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/routeoverride-cni/0.log"
Mar 19 12:23:24.533354 master-0 kubenswrapper[17644]: I0319 12:23:24.533264 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n8vwk_bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/whereabouts-cni-bincopy/0.log"
Mar 19 12:23:24.574980 master-0 kubenswrapper[17644]: I0319 12:23:24.574909 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n8vwk_bd8ff97d-047e-4ea7-ba6c-9fbc5da0514a/whereabouts-cni/0.log"
Mar 19 12:23:24.589973 master-0 kubenswrapper[17644]: I0319 12:23:24.589907 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-649577484c-phc8b_eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204/multus-admission-controller/0.log"
Mar 19 12:23:24.605167 master-0 kubenswrapper[17644]: I0319 12:23:24.605112 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-649577484c-phc8b_eb7a7fb9-31c8-4038-81bb-c3f8ca2ec204/kube-rbac-proxy/0.log"
Mar 19 12:23:24.636066 master-0 kubenswrapper[17644]: I0319 12:23:24.636006 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f6wv7_f29b11ce-60e0-46b3-8d28-eea3452513cd/network-metrics-daemon/0.log"
Mar 19 12:23:24.647130 master-0 kubenswrapper[17644]: I0319 12:23:24.647075 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f6wv7_f29b11ce-60e0-46b3-8d28-eea3452513cd/kube-rbac-proxy/0.log"
Mar 19 12:23:25.245462 master-0 kubenswrapper[17644]: I0319 12:23:25.245394 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_lvms-operator-7d575f666-kbbmh_130b56a7-a29e-4cc4-8464-bd5353d37d7a/manager/0.log"
Mar 19 12:23:25.265181 master-0 kubenswrapper[17644]: I0319 12:23:25.265113 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-7fdjj_353e7c87-ddee-472d-8a41-a4fc62ded137/vg-manager/1.log"
Mar 19 12:23:25.267563 master-0 kubenswrapper[17644]: I0319 12:23:25.267521 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-7fdjj_353e7c87-ddee-472d-8a41-a4fc62ded137/vg-manager/0.log"
Mar 19 12:23:25.947464 master-0 kubenswrapper[17644]: I0319 12:23:25.947372 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_6602edde-61c4-4316-a2ca-a21c764eb590/installer/0.log"
Mar 19 12:23:25.982056 master-0 kubenswrapper[17644]: I0319 12:23:25.981969 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_47d6a091-6854-4e44-8e7c-b2089cae286c/installer/0.log"
Mar 19 12:23:26.181314 master-0 kubenswrapper[17644]: I0319 12:23:26.181271 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_cd89f86c9be90c18d6ac0ac77e416132/kube-controller-manager/0.log"
Mar 19 12:23:26.254261 master-0 kubenswrapper[17644]: I0319 12:23:26.254106 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_cd89f86c9be90c18d6ac0ac77e416132/cluster-policy-controller/0.log"
Mar 19 12:23:26.270507 master-0 kubenswrapper[17644]: I0319 12:23:26.270434 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_cd89f86c9be90c18d6ac0ac77e416132/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:23:26.290746 master-0 kubenswrapper[17644]: I0319 12:23:26.290635 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_cd89f86c9be90c18d6ac0ac77e416132/kube-controller-manager-recovery-controller/0.log"
Mar 19 12:23:27.302028 master-0 kubenswrapper[17644]: I0319 12:23:27.301886 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-5gvgh_dbcbba74-ac53-4724-a217-4d9b85e7c1db/kube-controller-manager-operator/1.log"
Mar 19 12:23:27.316842 master-0 kubenswrapper[17644]: I0319 12:23:27.316789 17644 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-5gvgh_dbcbba74-ac53-4724-a217-4d9b85e7c1db/kube-controller-manager-operator/0.log"